[ 564.732054] env[59814]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 565.186701] env[59857]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 566.704267] env[59857]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59857) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 566.704632] env[59857]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59857) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 566.704727] env[59857]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59857) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 566.705026] env[59857]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 566.706122] env[59857]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 566.820720] env[59857]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59857) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 566.830531] env[59857]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59857) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 566.930985] env[59857]: INFO nova.virt.driver [None req-d20262c8-ff55-45db-a7d9-c51f24b7721f None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 567.003798] env[59857]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 567.003967] env[59857]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 567.004072] env[59857]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59857) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 570.194399] env[59857]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-93c5f67e-09ed-4734-9c06-af77fbf355bb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.209992] env[59857]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59857) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 570.210180] env[59857]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-6a3bbfe7-1f15-445e-a6c8-a0047a16ed02 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.234599] env[59857]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 634c8.
[ 570.234762] env[59857]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.231s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 570.235326] env[59857]: INFO nova.virt.vmwareapi.driver [None req-d20262c8-ff55-45db-a7d9-c51f24b7721f None None] VMware vCenter version: 7.0.3
[ 570.238753] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d954f10-fbcb-4003-9252-bf4361e5a2c8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.256571] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af75a1f8-6fca-4700-94d5-ca084696ab1f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.263026] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a881170-8339-4b96-9f22-2f03c8b3aed9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.269880] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88d2dd12-b62b-40bb-835b-7db6337dcc7e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.283402] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d897e255-8769-4841-b7d2-fb7d696a3f56 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.289858] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-180247ea-657b-4b86-ac8c-f810f099fd7e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.321462] env[59857]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-1add3b95-31fb-4563-ab1e-5e751e766e40 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 570.327315] env[59857]: DEBUG nova.virt.vmwareapi.driver [None req-d20262c8-ff55-45db-a7d9-c51f24b7721f None None] Extension org.openstack.compute already exists. {{(pid=59857) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 570.330020] env[59857]: INFO nova.compute.provider_config [None req-d20262c8-ff55-45db-a7d9-c51f24b7721f None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 570.347440] env[59857]: DEBUG nova.context [None req-d20262c8-ff55-45db-a7d9-c51f24b7721f None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),b3c1ceaa-bc48-48ee-8b60-930585e76a41(cell1) {{(pid=59857) load_cells /opt/stack/nova/nova/context.py:464}}
[ 570.349462] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 570.349674] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 570.350438] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 570.350786] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 570.350973] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 570.351970] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 570.365868] env[59857]: DEBUG oslo_db.sqlalchemy.engines [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59857) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 570.366273] env[59857]: DEBUG oslo_db.sqlalchemy.engines [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59857) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 570.373232] env[59857]: ERROR nova.db.main.api [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] No DB access allowed in nova-compute:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 570.373232] env[59857]:     result = function(*args, **kwargs)
[ 570.373232] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 570.373232] env[59857]:     return func(*args, **kwargs)
[ 570.373232] env[59857]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 570.373232] env[59857]:     result = fn(*args, **kwargs)
[ 570.373232] env[59857]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 570.373232] env[59857]:     return f(*args, **kwargs)
[ 570.373232] env[59857]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 570.373232] env[59857]:     return db.service_get_minimum_version(context, binaries)
[ 570.373232] env[59857]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 570.373232] env[59857]:     _check_db_access()
[ 570.373232] env[59857]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 570.373232] env[59857]:     stacktrace = ''.join(traceback.format_stack())
[ 570.373232] env[59857]: 
[ 570.375264] env[59857]: ERROR nova.db.main.api [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] No DB access allowed in nova-compute:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 570.375264] env[59857]:     result = function(*args, **kwargs)
[ 570.375264] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 570.375264] env[59857]:     return func(*args, **kwargs)
[ 570.375264] env[59857]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 570.375264] env[59857]:     result = fn(*args, **kwargs)
[ 570.375264] env[59857]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 570.375264] env[59857]:     return f(*args, **kwargs)
[ 570.375264] env[59857]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 570.375264] env[59857]:     return db.service_get_minimum_version(context, binaries)
[ 570.375264] env[59857]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 570.375264] env[59857]:     _check_db_access()
[ 570.375264] env[59857]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 570.375264] env[59857]:     stacktrace = ''.join(traceback.format_stack())
[ 570.375264] env[59857]: 
[ 570.376068] env[59857]: WARNING nova.objects.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 570.376068] env[59857]: WARNING nova.objects.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Failed to get minimum service version for cell b3c1ceaa-bc48-48ee-8b60-930585e76a41
[ 570.376465] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Acquiring lock "singleton_lock" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 570.376465] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Acquired lock "singleton_lock" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 570.376662] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Releasing lock "singleton_lock" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 570.377032] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Full set of CONF: {{(pid=59857) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 570.377185] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ******************************************************************************** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 570.377316] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] Configuration options gathered from: {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 570.377449] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 570.377668] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 570.377795] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ================================================================================ {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 570.378012] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] allow_resize_to_same_host = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.378190] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] arq_binding_timeout = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.378320] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] backdoor_port = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.378445] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] backdoor_socket = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.378621] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] block_device_allocate_retries = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.378766] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] block_device_allocate_retries_interval = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.378932] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cert = self.pem {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.379108] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.379280] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute_monitors = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.379443] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] config_dir = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.379611] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] config_drive_format = iso9660 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.379742] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.379905] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] config_source = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.380143] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] console_host = devstack {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.380342] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] control_exchange = nova {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.380503] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cpu_allocation_ratio = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.380662] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] daemon = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.380830] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] debug = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.380985] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] default_access_ip_network_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.381167] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] default_availability_zone = nova {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.381325] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] default_ephemeral_format = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.381577] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.381736] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] default_schedule_zone = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.381890] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] disk_allocation_ratio = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.382061] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] enable_new_services = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.382241] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] enabled_apis = ['osapi_compute'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.382404] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] enabled_ssl_apis = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.382575] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] flat_injected = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.382752] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] force_config_drive = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.382912] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] force_raw_images = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.383095] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] graceful_shutdown_timeout = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.383260] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] heal_instance_info_cache_interval = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.383480] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] host = cpu-1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.383656] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.383822] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.384149] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.384211] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.384375] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_build_timeout = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.384535] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_delete_interval = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.384725] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_format = [instance: %(uuid)s] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.384897] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_name_template = instance-%08x {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.385071] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_usage_audit = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.385242] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_usage_audit_period = month {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.385457] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.385583] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.385729] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] internal_service_availability_zone = internal {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.385903] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] key = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.386092] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] live_migration_retry_count = 30 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.386205] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_config_append = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.386373] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.386527] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_dir = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.386682] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.386815] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_options = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387017] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_rotate_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387184] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_rotate_interval_type = days {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387386] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] log_rotation_type = none {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387470] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387594] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387758] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.387919] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388058] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388220] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] long_rpc_timeout = 1800 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388376] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] max_concurrent_builds = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388538] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] max_concurrent_live_migrations = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388744] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] max_concurrent_snapshots = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388835] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] max_local_block_devices = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.388994] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] max_logfile_count = 30 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.389168] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] max_logfile_size_mb = 200 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.389326] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] maximum_instance_delete_attempts = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.389491] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metadata_listen = 0.0.0.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.389660] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metadata_listen_port = 8775 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.389830] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metadata_workers = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.390037] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] migrate_max_retries = -1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.390211] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] mkisofs_cmd = genisoimage {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.390419] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.390550] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] my_ip = 10.180.1.21 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.390734] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] network_allocate_retries = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 570.390925] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.391105] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.391271] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] osapi_compute_listen_port = 8774 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.391438] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] osapi_compute_unique_server_name_scope = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.391611] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] osapi_compute_workers = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.391780] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] password_length = 12 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.391946] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] periodic_enable = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.392122] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] periodic_fuzzy_delay = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.392294] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] pointer_model = usbtablet {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.392463] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] preallocate_images = none {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.392629] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] publish_errors = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.392752] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] pybasedir = /opt/stack/nova {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.392909] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ram_allocation_ratio = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.393081] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rate_limit_burst = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.393249] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rate_limit_except_level = CRITICAL {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.393407] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rate_limit_interval = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.393568] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reboot_timeout = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.393727] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reclaim_instance_interval = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.393881] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] record = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.394060] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reimage_timeout_per_gb = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.394227] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] report_interval = 120 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.394388] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rescue_timeout = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.394611] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reserved_host_cpus = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.394775] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reserved_host_disk_mb = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.394901] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reserved_host_memory_mb = 512 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.395084] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] reserved_huge_pages = None {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.395247] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] resize_confirm_window = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.395453] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] resize_fs_using_block_device = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.395559] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] resume_guests_state_on_host_boot = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.395726] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.395888] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rpc_response_timeout = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.396056] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] run_external_periodic_tasks = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.396249] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] running_deleted_instance_action = reap {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.396382] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.396539] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] running_deleted_instance_timeout = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.396695] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler_instance_sync_interval = 120 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.396850] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_down_time = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.397128] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] servicegroup_driver = db {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.397382] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] shelved_offload_time = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.397577] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] shelved_poll_interval = 3600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.397756] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] shutdown_timeout = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.397923] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] source_is_ipv6 = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.398097] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ssl_only = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.398361] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.398530] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] sync_power_state_interval = 600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.398694] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] sync_power_state_pool_size = 1000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.398867] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] syslog_log_facility = LOG_USER {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.399033] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] tempdir = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.399201] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] timeout_nbd = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.399417] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] transport_url = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.399526] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] update_resources_interval = 0 {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.399687] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_cow_images = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.399851] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_eventlog = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.400042] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_journal = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.400209] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_json = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.400367] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_rootwrap_daemon = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.400526] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_stderr = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.400721] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] use_syslog = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.400876] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vcpu_pin_set = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.401057] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
vif_plugging_is_fatal = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.401227] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plugging_timeout = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.401392] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] virt_mkfs = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.401553] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] volume_usage_poll_interval = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.401711] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] watch_log_file = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.401882] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] web = /usr/share/spice-html5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 570.402083] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_concurrency.disable_process_locking = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.402395] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.402585] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.402774] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.402948] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.403131] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.403297] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.403481] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.auth_strategy = keystone {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.403648] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.compute_link_prefix = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.403824] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.403996] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.dhcp_domain = novalocal {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.404203] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.enable_instance_password = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.404369] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.glance_link_prefix = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.404536] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.404733] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.404903] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.instance_list_per_project_cells = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.405083] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.list_records_by_skipping_down_cells = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.405243] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.local_metadata_per_cell = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.405407] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.max_limit = 1000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.405578] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.metadata_cache_expiration = 15 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.405768] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.neutron_default_tenant_id = default {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.405936] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.use_forwarded_for = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.406113] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.use_neutron_default_nets = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.406281] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.406433] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.406601] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.406775] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.406948] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.vendordata_dynamic_targets = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.407125] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.vendordata_jsonfile_path = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.407311] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.407534] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.backend = dogpile.cache.memcached {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.407716] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.backend_argument = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.407890] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.config_prefix = cache.oslo {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.408071] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.dead_timeout = 60.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.408237] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.debug_cache_backend = False {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.408398] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.enable_retry_client = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.408562] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.enable_socket_keepalive = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.408732] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.enabled = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.408988] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.expiration_time = 600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.409086] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.hashclient_retry_attempts = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.409255] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.409413] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_dead_retry = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.409578] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_password = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.409739] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.409925] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.410109] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_pool_maxsize = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.410272] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.410430] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_sasl_enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.410608] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.410771] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.410934] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.memcache_username = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.411114] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.proxies = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.411278] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.retry_attempts = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.411441] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.retry_delay = 0.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.411675] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.socket_keepalive_count = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.411766] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.socket_keepalive_idle = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.411928] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.socket_keepalive_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.412204] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.tls_allowed_ciphers = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.412267] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.tls_cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.412400] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.tls_certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.412558] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.tls_enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.412735] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cache.tls_keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.412903] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.413114] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.auth_type = password {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.413362] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.413603] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.413782] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.413952] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.414133] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.cross_az_attach = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.414299] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.debug = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.414458] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.endpoint_template = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.414640] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.http_retries = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.414809] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.414967] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.415150] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.os_region_name = RegionOne {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.415337] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.415499] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cinder.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.415704] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.415876] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.cpu_dedicated_set = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.416048] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.cpu_shared_set = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.416218] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.image_type_exclude_list = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.416382] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.416548] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.416710] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.416914] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.417053] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.417227] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.resource_provider_association_refresh = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.417388] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.shutdown_retry_interval = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.417567] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.417745] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] conductor.workers = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.417951] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] console.allowed_origins = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.418125] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] console.ssl_ciphers = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.418299] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] console.ssl_minimum_version = default {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.418473] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] consoleauth.token_ttl = 600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.418639] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.418797] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.418960] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.419150] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.419289] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.419446] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.419609] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.419767] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.419926] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.420094] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.420253] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.region_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.420410] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.420578] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.service_type = accelerator {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.420740] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.420899] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.421068] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.status_code_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.421232] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.421411] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.421570] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] cyborg.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.421810] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.backend = sqlalchemy {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.421921] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.connection = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.422110] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.connection_debug = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.422284] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.connection_parameters = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.422450] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.connection_recycle_time = 3600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.422617] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.connection_trace = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.422779] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.db_inc_retry_interval = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.422940] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.db_max_retries = 20 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.423114] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.db_max_retry_interval = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.423280] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.db_retry_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.423448] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.max_overflow = 50 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.423614] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.max_pool_size = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.423784] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.max_retries = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.423946] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.mysql_enable_ndb = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.424128] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.424289] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.mysql_wsrep_sync_wait = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.424449] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.pool_timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.424644] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.retry_interval = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.424814] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.slave_connection = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.424982] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.sqlite_synchronous = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.425157] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] database.use_db_reconnect = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.425338] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.backend = sqlalchemy {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.425909] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.connection = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.426118] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.connection_debug = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.426301] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.connection_parameters = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.426472] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.connection_recycle_time = 3600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.426696] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.connection_trace = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.426811] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.db_inc_retry_interval = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.426982] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.db_max_retries = 20 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.427166] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.db_max_retry_interval = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.427334] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.db_retry_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.427505] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.max_overflow = 50 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.427670] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.max_pool_size = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.427841] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.max_retries = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.428011] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.mysql_enable_ndb = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.428203] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.428365] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.428531] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.pool_timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.428700] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.retry_interval = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.428861] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.slave_connection = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.429039] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] api_database.sqlite_synchronous = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.429283] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] devices.enabled_mdev_types = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.429394] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.429555] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ephemeral_storage_encryption.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.429720] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.429890] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.api_servers = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.430065] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.430230] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.430394] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.430553] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.430712] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.430878] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.debug = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.431053] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.default_trusted_certificate_ids = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.431220] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.enable_certificate_validation = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.431382] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.enable_rbd_download = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.431543] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.431710] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.431934] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.432033] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.432199] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.432363] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.num_retries = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.432563] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.rbd_ceph_conf = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.432736] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.rbd_connect_timeout = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.432906] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.rbd_pool = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.433088] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.rbd_user = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.433253] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.region_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.433413] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.433582] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.service_type = image {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.433746] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.433904] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.434071] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.status_code_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.434234] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.434409] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.434577] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.verify_glance_signatures = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.434761] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] glance.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.434935] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] guestfs.debug = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.435124] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.config_drive_cdrom = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.435311] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.config_drive_inject_password = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.435490] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.435663] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.435822] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.enable_remotefx = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.435992] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.instances_path_share = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.436174] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.iscsi_initiator_list = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.436379] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.limit_cpu_features = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.436558] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.436755] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.436923] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.437098] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.437271] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.437467] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.use_multipath_io = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.437639] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.437805] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.437964] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.vswitch_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.438145] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.438317] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] mks.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.438707] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.438973] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] image_cache.manager_interval = 2400 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.439108] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] image_cache.precache_concurrency = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.439263] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] image_cache.remove_unused_base_images = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.439434] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.439634] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.439851] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] image_cache.subdirectory_name = _base {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.440055] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.api_max_retries = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.440226] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.api_retry_interval = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.440389] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.440554] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.auth_type = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.440747] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.440916] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.441096] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.441260] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.441420] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.441578] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.441771] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.insecure = False
{{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.441936] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.442110] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.442270] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.442427] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.partition_key = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.442619] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.peer_list = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.442783] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.region_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.442949] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.serial_console_state_timeout = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.443122] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.443292] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.service_type = baremetal {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.443454] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.443615] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.443775] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.status_code_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.443934] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.444133] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445110] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ironic.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445110] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445110] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] key_manager.fixed_key = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445110] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445110] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.barbican_api_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445365] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.barbican_endpoint = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445493] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.barbican_endpoint_type = public {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445693] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.barbican_region_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.445871] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.446050] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.446221] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
barbican.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.446384] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.446543] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.446708] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.number_of_retries = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.446874] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.retry_delay = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.447064] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.send_service_user_token = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.447246] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.447447] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.447613] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.verify_ssl = True {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.447772] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican.verify_ssl_path = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.447940] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.448116] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.auth_type = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.448276] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.448432] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.448599] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.448760] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.448917] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.keyfile = None {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.449091] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.449252] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] barbican_service_user.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.449417] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.approle_role_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.449576] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.approle_secret_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.449734] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.449889] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.450066] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.450256] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.450418] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.450592] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.kv_mountpoint = secret {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.450760] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.kv_version = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.450921] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.namespace = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.451089] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.root_token_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.451253] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.451410] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.ssl_ca_crt_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.451569] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.451735] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.use_ssl = False {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.451909] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.452088] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.452253] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.452413] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.452627] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.452816] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.452980] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.453177] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.453348] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.453505] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.453661] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.453819] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.region_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.453972] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.454159] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.service_type = identity {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.454319] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.454474] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.454665] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.status_code_retry_delay = 
None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.454824] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.455014] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.455185] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] keystone.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.455388] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.connection_uri = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.455575] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_mode = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.455744] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.455930] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_models = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.456116] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_power_governor_high = performance {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.456286] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.456452] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_power_management = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.456623] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.456790] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.device_detach_attempts = 8 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.457021] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.device_detach_timeout = 20 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.457214] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.disk_cachemodes = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.457360] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.disk_prefix = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.457525] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.enabled_perf_events = [] {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.457691] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.file_backed_memory = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.457859] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.gid_maps = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.458031] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.hw_disk_discard = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.458197] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.hw_machine_type = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.458373] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_rbd_ceph_conf = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.458538] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.458719] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.458931] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_rbd_glance_store_name = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
570.459121] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_rbd_pool = rbd {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.459298] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_type = default {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.459460] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.images_volume_group = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.459916] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.inject_key = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.459916] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.inject_partition = -2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.460079] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.inject_password = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.460111] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.iscsi_iface = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.460273] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.iser_use_multipath = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.460460] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None 
None] libvirt.live_migration_bandwidth = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.460677] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.460884] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_downtime = 500 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.461087] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.461273] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.461459] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_inbound_addr = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.461654] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.461833] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.462009] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_scheme = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.462192] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_timeout_action = abort {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.462363] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_tunnelled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.462526] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_uri = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.462714] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.live_migration_with_native_tls = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.462897] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.max_queues = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.463116] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.463320] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.nfs_mount_options = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.463713] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.463901] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.464079] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.464270] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.464542] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.464834] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.num_pcie_ports = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.465146] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.465451] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.pmem_namespaces = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.465737] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.quobyte_client_cfg = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.466257] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.466476] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.466658] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.466834] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.467008] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rbd_secret_uuid = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.467178] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rbd_user = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.467362] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.467521] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.467685] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rescue_image_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.467848] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rescue_kernel_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.468016] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rescue_ramdisk_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.468196] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.468358] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.rx_queue_size = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.468523] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.smbfs_mount_options = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.468806] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.468982] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.snapshot_compression = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.469159] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.snapshot_image_format = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.469404] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.469583] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.sparse_logical_volumes = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.469756] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.swtpm_enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.469928] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.swtpm_group = tss {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.470111] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.swtpm_user = tss {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.470338] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.sysinfo_serial = unique {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.470445] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.tx_queue_size = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.470612] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.uid_maps = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.470815] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.use_virtio_for_bridges = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.471041] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.virt_type = kvm {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.471248] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.volume_clear = zero {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.471429] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.volume_clear_size = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.471602] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.volume_use_multipath = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.471766] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_cache_path = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.471939] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.472125] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.472294] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.472470] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.472745] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.472921] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.vzstorage_mount_user = stack {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.473102] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.473280] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.473456] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.auth_type = password {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.473618] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.473804] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.473977] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.474156] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.474320] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.474490] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.default_floating_pool = public {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.474678] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.474847] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.extension_sync_interval = 600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.475015] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.http_retries = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.475185] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.475347] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.475505] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.475678] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.475835] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.476012] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.ovs_bridge = br-int {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.476185] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.physnets = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.476353] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.region_name = RegionOne {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.476520] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.service_metadata_proxy = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.476699] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.476898] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.service_type = network {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.477084] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.477246] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.477406] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.status_code_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.477570] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.477745] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.477908] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] neutron.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.478092] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] notifications.bdms_in_notifications = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.478272] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] notifications.default_level = INFO {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.478448] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] notifications.notification_format = unversioned {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.478610] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] notifications.notify_on_state_change = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.478788] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.478996] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] pci.alias = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.479184] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] pci.device_spec = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.479350] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] pci.report_in_placement = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.479524] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.479698] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.auth_type = password {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.479894] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.480068] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.480231] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.480459] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.480551] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.480712] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.480872] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.default_domain_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481041] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.default_domain_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481206] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.domain_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481363] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.domain_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481520] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481741] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481835] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.481990] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.482158] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.482328] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.password = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.482494] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.project_domain_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.482729] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.project_domain_name = Default {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.482982] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.project_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.483213] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.project_name = service {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.483394] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.region_name = RegionOne {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.483559] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.483730] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.service_type = placement {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.483900] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.484072] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.484237] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.status_code_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.484396] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.system_scope = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.484554] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.484737] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.trust_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.484901] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.user_domain_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.485083] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.user_domain_name = Default {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.485254] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.user_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.485422] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.username = placement {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.485623] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.485796] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] placement.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.485977] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.cores = 20 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.486161] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.count_usage_from_placement = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.486338] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.486517] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.injected_file_content_bytes = 10240 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.486710] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.injected_file_path_length = 255 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.486881] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.injected_files = 5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.487060] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.instances = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.487232] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.key_pairs = 100 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.487398] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.metadata_items = 128 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.487634] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.ram = 51200 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.487726] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.recheck_quota = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.487893] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.server_group_members = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.488076] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] quota.server_groups = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.488255] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rdp.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.488563] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.488753] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.488925] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.489103] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.image_metadata_prefilter = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.489270] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.489437] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.max_attempts = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.489602] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.max_placement_results = 1000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.489803] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.489967] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.490143] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.490307] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.490481] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] scheduler.workers = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.490655] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.490824] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.491012] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.491193] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.491362] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.491526] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.491707] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.491926] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 570.492102] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6
None None] filter_scheduler.host_subset_size = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.492265] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.492429] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.492595] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.isolated_hosts = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.492760] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.isolated_images = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.492923] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.493101] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.493268] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.pci_in_placement = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.493433] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 
None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.493595] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.493760] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.493920] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.494098] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.494280] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.494524] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.track_instance_changes = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.494678] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
570.494861] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metrics.required = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.495049] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metrics.weight_multiplier = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.495230] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.495400] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] metrics.weight_setting = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.495757] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.495932] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] serial_console.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.496125] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] serial_console.port_range = 10000:20000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.496297] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.496466] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.496672] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] serial_console.serialproxy_port = 6083 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.496854] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.497039] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.auth_type = password {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.497227] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.497387] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.497550] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.497765] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.497901] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.keyfile = None 
{{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.498086] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.send_service_user_token = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.498251] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.498411] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] service_user.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.498579] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.agent_enabled = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.498783] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.499123] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.499326] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.499501] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.html5proxy_port = 6082 {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.499663] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.image_compression = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.499828] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.jpeg_compression = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.500046] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.playback_compression = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.500233] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.server_listen = 127.0.0.1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.500404] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.500565] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.streaming_mode = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.500725] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] spice.zlib_compression = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.500883] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] upgrade_levels.baseapi = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.501081] env[59857]: 
DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] upgrade_levels.cert = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.501262] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] upgrade_levels.compute = auto {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.501420] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] upgrade_levels.conductor = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.501596] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] upgrade_levels.scheduler = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.501771] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.501933] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.502130] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.502293] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.502460] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.502617] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.502827] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.502934] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.503102] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vendordata_dynamic_auth.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.503279] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.api_retry_count = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.503440] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.ca_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.503613] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.503809] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None 
None] vmware.cluster_name = testcl1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.503988] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.connection_pool_size = 10 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.504165] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.console_delay_seconds = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.504340] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.datastore_regex = ^datastore.* {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.504552] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.504795] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.host_password = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.504990] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.host_port = 443 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.505180] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.host_username = administrator@vsphere.local {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.505353] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.insecure = True {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.505521] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.integration_bridge = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.505681] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.maximum_objects = 100 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.505841] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.pbm_default_policy = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.506024] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.pbm_enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.506179] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.pbm_wsdl_location = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.506354] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.506509] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.serial_port_proxy_uri = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.506664] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.serial_port_service_uri = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.506831] 
env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.task_poll_interval = 0.5 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.507009] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.use_linked_clone = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.507186] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.vnc_keymap = en-us {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.507353] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.vnc_port = 5900 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.507518] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vmware.vnc_port_total = 10000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.507705] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.auth_schemes = ['none'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.507888] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.508210] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.508399] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
vnc.novncproxy_host = 0.0.0.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.508574] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.novncproxy_port = 6080 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.508753] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.server_listen = 127.0.0.1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.508923] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.509097] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.vencrypt_ca_certs = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.509260] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.vencrypt_client_cert = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.509419] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vnc.vencrypt_client_key = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.509593] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.509760] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.disable_deep_image_inspection = False {{(pid=59857) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.509924] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.510153] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.510290] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.510452] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.disable_rootwrap = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.510615] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.enable_numa_live_migration = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.510833] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.510939] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.511112] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
workarounds.handle_virt_lifecycle_events = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.511277] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.libvirt_disable_apic = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.511437] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.511599] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.511762] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.511925] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.512098] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.512262] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.512424] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.512596] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.512796] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.512967] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.513165] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.513335] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.client_socket_timeout = 900 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.513502] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.default_pool_size = 1000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.513672] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.keep_alive = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.513842] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.max_header_line = 16384 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.514019] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.secure_proxy_ssl_header = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.514189] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.ssl_ca_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.514346] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.ssl_cert_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.514507] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.ssl_key_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.514698] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.tcp_keepidle = 600 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.514883] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.515061] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] zvm.ca_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.515227] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] zvm.cloud_connector_url = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.515597] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.515725] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] zvm.reachable_timeout = 300 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.515918] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.enforce_new_defaults = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.516101] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.enforce_scope = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.516278] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.policy_default_rule = default {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.516460] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.516634] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.policy_file = policy.yaml {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.516808] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None 
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.516982] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.517146] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.517303] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.517465] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.517634] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.517810] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518032] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.connection_string = messaging:// {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518162] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.enabled = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518384] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.es_doc_type = notification {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518494] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.es_scroll_size = 10000 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518662] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.es_scroll_time = 2m {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518827] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.filter_error_trace = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.518992] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.519174] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.sentinel_service_name = mymaster {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.519343] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] profiler.socket_timeout = 0.1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.519507] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
profiler.trace_sqlalchemy = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.519676] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] remote_debug.host = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.519838] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] remote_debug.port = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.520014] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.520185] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.520351] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.520656] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.520687] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.520850] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.521021] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.521190] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.521386] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.521505] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.521677] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.521842] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.522026] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.522189] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.522356] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.522535] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.522696] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.522859] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.523049] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.523201] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.523363] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.523528] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.523689] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.523857] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.524030] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.524203] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.ssl = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.524376] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.524546] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.524737] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59857) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.524915] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.525105] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.525295] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.525465] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_notifications.retry = -1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.525682] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.526073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.526265] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.auth_section = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.526440] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.auth_type = None {{(pid=59857) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.526598] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.cafile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.526761] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.certfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.526926] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.collect_timing = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.527099] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.connect_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.527261] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.connect_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.527420] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.endpoint_id = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.527580] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.endpoint_override = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.527793] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.insecure = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.527902] env[59857]: DEBUG 
oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.keyfile = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.528070] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.max_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.528233] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.min_version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.528387] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.region_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.528542] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.service_name = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.528700] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.service_type = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.528864] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.split_loggers = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.529873] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.status_code_retries = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.529873] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
oslo_limit.status_code_retry_delay = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.529873] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.timeout = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.529873] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.valid_interfaces = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.529873] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_limit.version = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.529873] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_reports.file_event_handler = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530161] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530161] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] oslo_reports.log_dir = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530294] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530441] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530596] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530757] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.530915] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.531576] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.531576] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.531576] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_ovs_privileged.group = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.531576] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.531732] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.531867] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.532060] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] vif_plug_ovs_privileged.user = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.532227] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.532520] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.532621] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.532856] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.532948] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.533065] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.533443] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.533443] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.533556] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_ovs.isolate_vif = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.533863] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.533908] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.534045] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.534315] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.534381] env[59857]: DEBUG oslo_service.service [None 
req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_vif_ovs.per_port_bridge = False {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] os_brick.lock_path = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] privsep_osbrick.capabilities = [21] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] privsep_osbrick.group = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] privsep_osbrick.helper_command = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536073] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] privsep_osbrick.user = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536343] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] 
nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536343] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] nova_sys_admin.group = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536343] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] nova_sys_admin.helper_command = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536343] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536343] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536476] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] nova_sys_admin.user = None {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 570.536589] env[59857]: DEBUG oslo_service.service [None req-9a9115fe-b88d-4514-8432-8efcfc1535d6 None None] ******************************************************************************** {{(pid=59857) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 570.538830] env[59857]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 570.546924] env[59857]: INFO nova.virt.node [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Generated node identity 80c650ad-13a5-4d4e-96b3-a14b31abfa11 [ 570.547650] env[59857]: INFO nova.virt.node [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] 
Wrote node identity 80c650ad-13a5-4d4e-96b3-a14b31abfa11 to /opt/stack/data/n-cpu-1/compute_id [ 570.558552] env[59857]: WARNING nova.compute.manager [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Compute nodes ['80c650ad-13a5-4d4e-96b3-a14b31abfa11'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 570.598401] env[59857]: INFO nova.compute.manager [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 570.621809] env[59857]: WARNING nova.compute.manager [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 570.622208] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.622441] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.622626] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 570.622791] env[59857]: DEBUG 
nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59857) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 570.624093] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7be887e-42bb-400f-a65e-241488addc67 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.633480] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f53db908-86b9-4344-a579-ba9b1b59b593 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.648917] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0361ba72-b651-475a-8868-b8d73480cb43 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.655637] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a3381c-f11a-46ac-a3f5-2790ac2ec868 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.689026] env[59857]: DEBUG nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181528MB free_disk=154GB free_vcpus=48 pci_devices=None {{(pid=59857) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 570.689026] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.689026] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.698974] env[59857]: WARNING nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] No compute node record for cpu-1:80c650ad-13a5-4d4e-96b3-a14b31abfa11: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 80c650ad-13a5-4d4e-96b3-a14b31abfa11 could not be found. [ 570.711868] env[59857]: INFO nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 [ 570.762409] env[59857]: DEBUG nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 570.762619] env[59857]: DEBUG nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 570.864548] env[59857]: INFO nova.scheduler.client.report [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] [req-1f7e3173-15c5-4118-bca2-75e40c75c5c9] Created resource provider record via placement API for resource provider with UUID 80c650ad-13a5-4d4e-96b3-a14b31abfa11 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 570.880463] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdefee3a-ed5a-42b1-bc1b-3b531cb6f556 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.888216] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d10dc32c-8491-42f7-a38d-73ded9800307 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.917365] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2659ae-905a-460a-b45e-ff3896b44b03 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.923837] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3d531a9-5750-4402-8a6c-abe8839ae3e5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.936586] env[59857]: DEBUG nova.compute.provider_tree [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Updating inventory in ProviderTree for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 570.972271] env[59857]: DEBUG nova.scheduler.client.report [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Updated inventory for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 570.972436] env[59857]: DEBUG nova.compute.provider_tree [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Updating resource provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 generation from 0 to 1 during operation: update_inventory {{(pid=59857) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 570.972602] env[59857]: DEBUG nova.compute.provider_tree [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Updating inventory in ProviderTree for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 571.014139] env[59857]: DEBUG nova.compute.provider_tree [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Updating resource provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 generation from 1 to 2 during operation: update_traits {{(pid=59857) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 571.032401] env[59857]: DEBUG nova.compute.resource_tracker [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59857) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 571.032592] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 571.032757] env[59857]: DEBUG nova.service [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Creating RPC server for service compute {{(pid=59857) start /opt/stack/nova/nova/service.py:182}} [ 571.045692] env[59857]: DEBUG nova.service [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] Join ServiceGroup membership for this service compute {{(pid=59857) start /opt/stack/nova/nova/service.py:199}} [ 571.045875] env[59857]: DEBUG nova.servicegroup.drivers.db [None req-6f9925a7-7dda-43d4-b7e2-380e23a79b6d None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59857) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 590.047742] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._sync_power_states {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 590.058418] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Getting list of instances from cluster (obj){ [ 590.058418] env[59857]: value = "domain-c8" [ 590.058418] env[59857]: _type = "ClusterComputeResource" [ 590.058418] env[59857]: } {{(pid=59857) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 590.059558] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12835d71-1abc-47a4-a0d9-3b9570a76a1b {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 590.068415] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Got total of 0 instances {{(pid=59857) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 590.068623] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 590.068933] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Getting list of instances from cluster (obj){ [ 590.068933] env[59857]: value = "domain-c8" [ 590.068933] env[59857]: _type = "ClusterComputeResource" [ 590.068933] env[59857]: } {{(pid=59857) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 590.069776] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-181b5f4e-639b-4832-837a-c9a90225729f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 590.077070] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Got total of 0 instances {{(pid=59857) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 610.875956] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "31aed7b3-ea4a-4db4-b919-5d754f4c3b17" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.876331] env[59857]: DEBUG oslo_concurrency.lockutils [None 
req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "31aed7b3-ea4a-4db4-b919-5d754f4c3b17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.891561] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 610.987487] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.987735] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.989445] env[59857]: INFO nova.compute.claims [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 611.138211] env[59857]: DEBUG oslo_concurrency.lockutils [None 
req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "de589259-86c5-4830-8507-2de7ad76c034" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.139184] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "de589259-86c5-4830-8507-2de7ad76c034" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.155698] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2cabc3a-0ec1-4e52-ab68-738e5f97e98f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.159806] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 611.167842] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3ff34bd-c178-44fc-933c-4b4791cef3db {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.214846] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9403bc14-308d-42ee-8f0e-a8f5f887d4c3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.225931] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb0295b-79ac-44b5-957f-70bc7345374b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.235414] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.245628] env[59857]: DEBUG nova.compute.provider_tree [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 611.253885] env[59857]: DEBUG nova.scheduler.client.report [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': 
{'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 611.280275] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.281541] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 611.284500] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.051s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.286227] env[59857]: INFO nova.compute.claims [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 611.338958] env[59857]: DEBUG nova.compute.utils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 611.340252] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 611.340482] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 611.352932] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 611.404640] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-195cf39d-b52a-48be-aa91-4dad9ffdf814 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.414574] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e301eab-b536-44a7-9a13-07e07f3b7c26 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.452535] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 611.454467] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00e8d219-9338-4aaa-ab11-7d1132c8df67 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.467217] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3970d684-c19e-4762-a7f9-2d97003d84a1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.479745] env[59857]: DEBUG nova.compute.provider_tree [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 611.492156] env[59857]: DEBUG nova.scheduler.client.report [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 611.508011] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.508599] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 611.550007] env[59857]: DEBUG nova.compute.utils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 611.552299] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 611.552766] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 611.565320] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 611.669571] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 611.673021] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 611.673021] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 611.673021] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Flavor pref 
0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 611.673021] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 611.673021] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 611.673669] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 611.673669] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 611.673669] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 611.673669] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 
tempest-ServerExternalEventsTest-1363802782-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 611.673669] env[59857]: DEBUG nova.virt.hardware [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 611.673958] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24310794-d1c6-41f2-a09b-d58954232f7b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.681427] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 611.697878] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81706f25-dc7e-4915-a365-37a44c68bbf0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.722048] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e0b1eea-26b8-49d8-b1fc-f0aebc3962c6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.742760] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 611.743297] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:348}} [ 611.743297] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 611.743563] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 611.743563] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 611.743704] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 611.743934] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 611.744098] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 
tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 611.744257] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 611.744412] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 611.744572] env[59857]: DEBUG nova.virt.hardware [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 611.745459] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b22fc69-4c27-414e-9a22-88800626eea7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.753903] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffd23c55-bfa3-4517-b96f-6fdb10facedd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.870391] env[59857]: DEBUG nova.policy [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Policy check for 
network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd051ff4b35d4acfa361115f90e620a3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '127a3eb5002944c5a51c17c72f860bca', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 611.875886] env[59857]: DEBUG nova.policy [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '888b9b022a5449a882fe7877924d1a02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '72a0c826169d4687ab1a83684f443d9a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 611.951647] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "21e37459-3ce1-41e7-8317-b98edafb15d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.952328] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock 
"21e37459-3ce1-41e7-8317-b98edafb15d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.969639] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 612.024109] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.024355] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.026111] env[59857]: INFO nova.compute.claims [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 612.158048] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da6f22fa-5e68-4b59-9e01-4e194ffe444a {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.166294] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e19867b0-79da-46d1-a32f-85012a600aa2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.200501] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-466b48e7-cfe7-4048-ab8b-7bbd94924e65 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.207898] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-705738f2-1d86-4ec6-ac3b-14fe0c2a2160 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.223359] env[59857]: DEBUG nova.compute.provider_tree [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 612.232197] env[59857]: DEBUG nova.scheduler.client.report [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 612.249055] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.249055] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 612.287044] env[59857]: DEBUG nova.compute.utils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 612.288423] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Not allocating networking since 'none' was specified. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 612.300043] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 612.381455] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 612.408466] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 612.408747] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 612.408900] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e 
tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 612.409083] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 612.409224] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 612.409364] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 612.410033] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 612.410033] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 612.410130] env[59857]: DEBUG 
nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 612.410214] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 612.410381] env[59857]: DEBUG nova.virt.hardware [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 612.411451] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8029b221-69ed-4005-af7a-cd4dcab1b19f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.420171] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43c7e7d8-f4ad-47e9-a021-1aa7ecc9f1f2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.437180] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Instance VIF info [] {{(pid=59857) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 612.448571] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.448846] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-45a0900a-2c83-4c3e-920b-8cc5d169e33c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.459673] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Created folder: OpenStack in parent group-v4. [ 612.459871] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Creating folder: Project (a46c26434cd94ea2bcea9461abb7f359). Parent ref: group-v286134. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.460092] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d67da3a9-44cc-42cd-abd3-839a886abcad {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.470509] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Created folder: Project (a46c26434cd94ea2bcea9461abb7f359) in parent group-v286134. [ 612.470594] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Creating folder: Instances. Parent ref: group-v286135. 
{{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.470794] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dde8a5de-acca-4d0d-807e-8c69500d5306 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.479679] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Created folder: Instances in parent group-v286135. [ 612.479920] env[59857]: DEBUG oslo.service.loopingcall [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 612.480115] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Creating VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 612.480298] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4c876202-6d4e-420c-8c7e-1ae7d581b20e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.498458] env[59857]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 612.498458] env[59857]: value = "task-1341415" [ 612.498458] env[59857]: _type = "Task" [ 612.498458] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.506747] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341415, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.926680] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Successfully created port: 3f73c04b-d229-4bef-90ed-9080b13ee00b {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 612.945531] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Successfully created port: 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 613.012868] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341415, 'name': CreateVM_Task, 'duration_secs': 0.300139} completed successfully. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 613.013136] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Created VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 613.014085] env[59857]: DEBUG oslo_vmware.service [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b75a91b-b16c-4e86-9941-552bffe395cb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.020837] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.020966] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.021636] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 613.021883] env[59857]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91898efc-6129-4dbe-839a-2afbeeec4530 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.030745] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Waiting for the task: (returnval){ [ 613.030745] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5219b4a6-9c08-2c26-ca1b-93213016afcb" [ 613.030745] env[59857]: _type = "Task" [ 613.030745] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 613.041490] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5219b4a6-9c08-2c26-ca1b-93213016afcb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 613.544207] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.544473] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Processing image 4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 613.544692] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.544866] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.545434] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Creating 
directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 613.545686] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bd34c28f-18dc-4c62-8409-3bc41397e9d2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.565239] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 613.565239] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59857) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 613.565452] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af7b998d-4393-42bc-b226-1d442dae29c5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.576724] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d8326cb4-9b77-415b-b9bd-5f8c2df79cdc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.583463] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Waiting for the task: (returnval){ [ 613.583463] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52f86338-3869-b745-ab87-7711e642ad34" [ 613.583463] env[59857]: 
_type = "Task" [ 613.583463] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 613.596738] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52f86338-3869-b745-ab87-7711e642ad34, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 614.101445] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Preparing fetch location {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 614.101733] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Creating directory with path [datastore2] vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 614.101844] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9233d27-370c-4513-a5f9-f463027f9f50 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.135303] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Created directory with path [datastore2] vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 614.135519] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Fetch image to [datastore2] vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 614.135676] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to [datastore2] vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 614.138912] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5560de08-9a66-4c09-9c9c-0370be4fff2a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.149840] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d3e19bd-8d55-4529-998b-c55d1cafe9db {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.161594] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-255a0072-2fcf-4242-9f3d-591b032762bc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.199928] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3c2597dc-74d8-4bf2-b22c-62857c2f49b4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.205229] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "17c89372-97f2-4ffa-a13e-606d4f31b08f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.205450] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "17c89372-97f2-4ffa-a13e-606d4f31b08f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.212185] env[59857]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0a1c11be-de46-4e67-b0b2-c84b39b42af5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.221865] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 614.290682] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.291119] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.292802] env[59857]: INFO nova.compute.claims [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 614.301991] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 614.401425] env[59857]: DEBUG oslo_vmware.rw_handles [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59857) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 614.472515] env[59857]: DEBUG oslo_vmware.rw_handles [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Completed reading data from the image iterator. {{(pid=59857) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 614.472683] env[59857]: DEBUG oslo_vmware.rw_handles [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59857) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 614.528786] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a96b1f-e031-410a-9e1e-40c0bb70a935 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.538660] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71c54368-43f9-4271-9b94-d492defadf1b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.572792] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d27c8cac-7437-4ae2-b8ab-491ea073182b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.580409] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5faa294-caaf-4c89-a10d-d8b8620ab761 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.594171] env[59857]: DEBUG nova.compute.provider_tree [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 614.607852] env[59857]: DEBUG nova.scheduler.client.report [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 
1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 614.623866] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.624267] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 614.667624] env[59857]: DEBUG nova.compute.utils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 614.668885] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 614.669172] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 614.684097] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 614.768061] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 614.797420] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 614.797713] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 614.797864] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 614.798138] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 614.798343] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 614.798530] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 614.798778] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 614.798973] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 614.799190] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 614.799392] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 614.799601] env[59857]: DEBUG nova.virt.hardware [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 614.800943] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc0d6e42-fd02-44ca-833a-d263e025bf89 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.812581] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f853f8fc-16f5-48af-97cb-7963d741c567 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.058877] env[59857]: DEBUG nova.policy [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7860a3233494e459d7e7202299108a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c1f3e5c6ab1a4a5baf037e03c6f22934', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 616.474614] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 
17c89372-97f2-4ffa-a13e-606d4f31b08f] Successfully created port: 717ab373-c7e5-4f17-99ea-5059dde6cc33 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 618.429151] env[59857]: ERROR nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. [ 618.429151] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 618.429151] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 618.429151] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 618.429151] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 618.429151] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 618.429151] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 618.429151] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 618.429151] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 618.429151] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 618.429151] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 618.429151] env[59857]: ERROR nova.compute.manager raise self.value [ 618.429151] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 618.429151] env[59857]: 
ERROR nova.compute.manager updated_port = self._update_port( [ 618.429151] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 618.429151] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 618.429821] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 618.429821] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 618.429821] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. [ 618.429821] env[59857]: ERROR nova.compute.manager [ 618.429963] env[59857]: Traceback (most recent call last): [ 618.432758] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 618.432758] env[59857]: listener.cb(fileno) [ 618.432758] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 618.432758] env[59857]: result = function(*args, **kwargs) [ 618.432758] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 618.432758] env[59857]: return func(*args, **kwargs) [ 618.432758] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 618.432758] env[59857]: raise e [ 618.432758] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 618.432758] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 618.432758] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 618.432758] env[59857]: created_port_ids = self._update_ports_for_instance( [ 618.432758] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 618.432758] env[59857]: with 
excutils.save_and_reraise_exception(): [ 618.432758] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 618.432758] env[59857]: self.force_reraise() [ 618.432758] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 618.432758] env[59857]: raise self.value [ 618.432758] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 618.432758] env[59857]: updated_port = self._update_port( [ 618.432758] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 618.432758] env[59857]: _ensure_no_port_binding_failure(port) [ 618.432758] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 618.432758] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 618.432758] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. [ 618.432758] env[59857]: Removing descriptor: 14 [ 618.433615] env[59857]: ERROR nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. 
[ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Traceback (most recent call last): [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] yield resources [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self.driver.spawn(context, instance, image_meta, [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self._vmops.spawn(context, instance, image_meta, injected_files, [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] vm_ref = self.build_virtual_machine(instance, [ 618.433615] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] vif_infos = vmwarevif.get_vif_info(self._session, [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 618.433872] env[59857]: ERROR 
nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] for vif in network_info: [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return self._sync_wrapper(fn, *args, **kwargs) [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self.wait() [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self[:] = self._gt.wait() [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return self._exit_event.wait() [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] result = hub.switch() [ 618.433872] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return self.greenlet.switch() [ 618.434203] env[59857]: ERROR 
nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] result = function(*args, **kwargs) [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return func(*args, **kwargs) [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] raise e [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] nwinfo = self.network_api.allocate_for_instance( [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] created_port_ids = self._update_ports_for_instance( [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 618.434203] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] with excutils.save_and_reraise_exception(): [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 
31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self.force_reraise() [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] raise self.value [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] updated_port = self._update_port( [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] _ensure_no_port_binding_failure(port) [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] raise exception.PortBindingFailed(port_id=port['id']) [ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. 
[ 618.434510] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17]
[ 618.434811] env[59857]: INFO nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Terminating instance
[ 618.436595] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 618.437309] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquired lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 618.437540] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 618.534020] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 618.566705] env[59857]: ERROR nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.
[ 618.566705] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 618.566705] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 618.566705] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 618.566705] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 618.566705] env[59857]: ERROR nova.compute.manager self.force_reraise()
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 618.566705] env[59857]: ERROR nova.compute.manager raise self.value
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 618.566705] env[59857]: ERROR nova.compute.manager updated_port = self._update_port(
[ 618.566705] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 618.566705] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 618.567163] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 618.567163] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 618.567163] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.
[ 618.567163] env[59857]: ERROR nova.compute.manager
[ 618.567163] env[59857]: Traceback (most recent call last):
[ 618.567163] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 618.567163] env[59857]: listener.cb(fileno)
[ 618.567163] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 618.567163] env[59857]: result = function(*args, **kwargs)
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 618.567163] env[59857]: return func(*args, **kwargs)
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 618.567163] env[59857]: raise e
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 618.567163] env[59857]: nwinfo = self.network_api.allocate_for_instance(
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 618.567163] env[59857]: created_port_ids = self._update_ports_for_instance(
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 618.567163] env[59857]: with excutils.save_and_reraise_exception():
[ 618.567163] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 618.567163] env[59857]: self.force_reraise()
[ 618.567163] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 618.567163] env[59857]: raise self.value
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 618.567163] env[59857]: updated_port = self._update_port(
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 618.567163] env[59857]: _ensure_no_port_binding_failure(port)
[ 618.567163] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 618.567163] env[59857]: raise exception.PortBindingFailed(port_id=port['id'])
[ 618.567984] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.
[ 618.567984] env[59857]: Removing descriptor: 12
[ 618.567984] env[59857]: ERROR nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] Traceback (most recent call last):
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] yield resources
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self.driver.spawn(context, instance, image_meta,
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 618.567984] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] vm_ref = self.build_virtual_machine(instance,
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] vif_infos = vmwarevif.get_vif_info(self._session,
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] for vif in network_info:
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return self._sync_wrapper(fn, *args, **kwargs)
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self.wait()
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self[:] = self._gt.wait()
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return self._exit_event.wait()
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 618.568294] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] result = hub.switch()
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return self.greenlet.switch()
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] result = function(*args, **kwargs)
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return func(*args, **kwargs)
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] raise e
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] nwinfo = self.network_api.allocate_for_instance(
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] created_port_ids = self._update_ports_for_instance(
[ 618.568653] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] with excutils.save_and_reraise_exception():
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self.force_reraise()
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] raise self.value
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] updated_port = self._update_port(
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] _ensure_no_port_binding_failure(port)
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] raise exception.PortBindingFailed(port_id=port['id'])
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.
[ 618.569073] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034]
[ 618.569824] env[59857]: INFO nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Terminating instance
[ 618.569824] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "refresh_cache-de589259-86c5-4830-8507-2de7ad76c034" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 618.569824] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquired lock "refresh_cache-de589259-86c5-4830-8507-2de7ad76c034" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 618.569907] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 618.624414] env[59857]: DEBUG nova.compute.manager [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Received event network-changed-4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 618.624805] env[59857]: DEBUG nova.compute.manager [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Refreshing instance network info cache due to event network-changed-4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 618.624805] env[59857]: DEBUG oslo_concurrency.lockutils [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] Acquiring lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 618.652142] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 618.786110] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "b7ab8792-137d-4053-9df9-3d560aa5e411" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.786110] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b7ab8792-137d-4053-9df9-3d560aa5e411" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.800504] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 618.856963] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.857233] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.860818] env[59857]: INFO nova.compute.claims [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 618.999016] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 619.012837] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Releasing lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 619.013658] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 619.013658] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 619.014556] env[59857]: DEBUG oslo_concurrency.lockutils [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] Acquired lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 619.014750] env[59857]: DEBUG nova.network.neutron [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Refreshing network info cache for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 619.015689] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-19234b32-9ed0-46a0-8c58-af6375301d05 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.020016] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f85030-57aa-4016-a467-f87d4e93e2c9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.032113] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8b527dc-4d9e-452b-821d-2844edc25d62 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.060428] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa347aac-0985-4dae-8540-179f9fbecffb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.064387] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17 could not be found.
[ 619.064740] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 619.064991] env[59857]: INFO nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 619.066246] env[59857]: DEBUG oslo.service.loopingcall [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 619.066749] env[59857]: DEBUG nova.compute.manager [-] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 619.066841] env[59857]: DEBUG nova.network.neutron [-] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 619.098876] env[59857]: DEBUG nova.network.neutron [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 619.101314] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3041fb32-5252-44f8-8708-caa1eab09ef3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.109552] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da1d4061-0d1c-4f50-a918-5b6f6544f7df {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.115922] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 619.125228] env[59857]: DEBUG nova.compute.provider_tree [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 619.129016] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Releasing lock "refresh_cache-de589259-86c5-4830-8507-2de7ad76c034" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 619.129016] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 619.129016] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 619.129565] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-faee9d46-a17b-4a08-8489-9154cf23ed31 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.139220] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dcd7f1e-5cdb-4e0b-bca4-63f2c46286be {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.164464] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance de589259-86c5-4830-8507-2de7ad76c034 could not be found.
[ 619.164717] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 619.164867] env[59857]: INFO nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 619.165106] env[59857]: DEBUG oslo.service.loopingcall [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 619.165976] env[59857]: DEBUG nova.scheduler.client.report [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 619.169018] env[59857]: DEBUG nova.compute.manager [-] [instance: de589259-86c5-4830-8507-2de7ad76c034] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 619.169114] env[59857]: DEBUG nova.network.neutron [-] [instance: de589259-86c5-4830-8507-2de7ad76c034] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 619.179236] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 619.179729] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 619.218037] env[59857]: DEBUG nova.compute.utils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 619.218888] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Not allocating networking since 'none' was specified. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 619.227634] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 619.243236] env[59857]: DEBUG nova.network.neutron [-] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 619.252310] env[59857]: DEBUG nova.network.neutron [-] [instance: de589259-86c5-4830-8507-2de7ad76c034] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 619.273421] env[59857]: DEBUG nova.network.neutron [-] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 619.281956] env[59857]: DEBUG nova.network.neutron [-] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 619.283432] env[59857]: INFO nova.compute.manager [-] [instance: de589259-86c5-4830-8507-2de7ad76c034] Took 0.11 seconds to deallocate network for instance.
[ 619.289384] env[59857]: DEBUG nova.compute.claims [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 619.289564] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 619.289794] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 619.292700] env[59857]: INFO nova.compute.manager [-] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Took 0.23 seconds to deallocate network for instance.
[ 619.294426] env[59857]: DEBUG nova.compute.claims [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 619.294703] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.325798] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 619.350815] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.350815] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.350815] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.351058] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Flavor pref 0:0:0 {{(pid=59857) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.351058] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 619.351058] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.351058] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.351058] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 619.351227] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.351227] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.351344] env[59857]: DEBUG nova.virt.hardware [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.353210] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf2e65a6-05db-47b9-a318-9ba2784f23db {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.365389] env[59857]: DEBUG nova.network.neutron [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.373533] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03edd3a9-9786-4283-973b-f39942b61da9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.396813] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Instance VIF info [] {{(pid=59857) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 619.401128] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Creating folder: Project (db69c56463ff4c458b8adf0fe0ba520a). Parent ref: group-v286134. 
{{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 619.404204] env[59857]: DEBUG oslo_concurrency.lockutils [req-8a66fa67-9ca9-48c7-aba7-82672ed076e3 req-ff8ef086-31f6-434d-9ea9-331a32dbfcf1 service nova] Releasing lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 619.404807] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-34a9e594-f741-4643-a67d-e8265539c1ad {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.415916] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Created folder: Project (db69c56463ff4c458b8adf0fe0ba520a) in parent group-v286134. [ 619.416135] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Creating folder: Instances. Parent ref: group-v286138. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 619.416135] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6e803042-2d4f-4795-b584-70c1b75bbd50 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.425243] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Created folder: Instances in parent group-v286138. 
[ 619.425243] env[59857]: DEBUG oslo.service.loopingcall [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 619.425418] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Creating VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 619.425581] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0e62ed35-0eba-4623-bc4c-7819850314cb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.446693] env[59857]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 619.446693] env[59857]: value = "task-1341418" [ 619.446693] env[59857]: _type = "Task" [ 619.446693] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 619.454500] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341418, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 619.469586] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79876054-68da-4006-8324-4011b361b589 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.477821] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ff1cab-b9bb-42fb-b71c-327e50516625 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.518594] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d00b011f-f604-4c26-a527-9d88243a27fe {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.525027] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcc7797b-4eec-4caf-b567-e6b35a768c69 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.538838] env[59857]: DEBUG nova.compute.provider_tree [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.555061] env[59857]: DEBUG nova.scheduler.client.report [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.574564] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.575199] env[59857]: ERROR nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information. 
[ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] Traceback (most recent call last): [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self.driver.spawn(context, instance, image_meta, [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self._vmops.spawn(context, instance, image_meta, injected_files, [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] vm_ref = self.build_virtual_machine(instance, [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] vif_infos = vmwarevif.get_vif_info(self._session, [ 619.575199] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] for vif in network_info: [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 619.575561] env[59857]: ERROR 
nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return self._sync_wrapper(fn, *args, **kwargs) [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self.wait() [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self[:] = self._gt.wait() [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return self._exit_event.wait() [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] result = hub.switch() [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return self.greenlet.switch() [ 619.575561] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] result = function(*args, **kwargs) [ 
619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] return func(*args, **kwargs) [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] raise e [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] nwinfo = self.network_api.allocate_for_instance( [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] created_port_ids = self._update_ports_for_instance( [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] with excutils.save_and_reraise_exception(): [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 619.576106] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] self.force_reraise() [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: 
de589259-86c5-4830-8507-2de7ad76c034] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] raise self.value [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] updated_port = self._update_port( [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] _ensure_no_port_binding_failure(port) [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] raise exception.PortBindingFailed(port_id=port['id']) [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information. [ 619.576437] env[59857]: ERROR nova.compute.manager [instance: de589259-86c5-4830-8507-2de7ad76c034] [ 619.576759] env[59857]: DEBUG nova.compute.utils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information. 
{{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 619.577073] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.282s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.582195] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Build of instance de589259-86c5-4830-8507-2de7ad76c034 was re-scheduled: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 619.582636] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 619.582849] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "refresh_cache-de589259-86c5-4830-8507-2de7ad76c034" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 619.582983] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 
tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquired lock "refresh_cache-de589259-86c5-4830-8507-2de7ad76c034" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.583151] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 619.640908] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 619.747812] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de83a767-18aa-4611-92f7-75105b3423ee {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.757933] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7211a80-b794-425d-be7a-0ef425c8d391 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.796249] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e03f00bc-4d17-4aa3-8c7b-13c11a5cc657 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.804682] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9720125c-478a-4eb0-9371-43998ef2935f {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.810925] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.828006] env[59857]: DEBUG nova.compute.provider_tree [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.832356] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Releasing lock "refresh_cache-de589259-86c5-4830-8507-2de7ad76c034" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 619.832569] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 619.832719] env[59857]: DEBUG nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 619.832883] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 619.840699] env[59857]: DEBUG nova.scheduler.client.report [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.860284] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.283s {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.861945] env[59857]: ERROR nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Traceback (most recent call last): [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self.driver.spawn(context, instance, image_meta, [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self._vmops.spawn(context, instance, image_meta, injected_files, [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] vm_ref = self.build_virtual_machine(instance, [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] vif_infos = vmwarevif.get_vif_info(self._session, [ 
619.861945] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] for vif in network_info: [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return self._sync_wrapper(fn, *args, **kwargs) [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self.wait() [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self[:] = self._gt.wait() [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return self._exit_event.wait() [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] result = hub.switch() [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return self.greenlet.switch() [ 619.862286] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] result = function(*args, **kwargs) [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] return func(*args, **kwargs) [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] raise e [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] nwinfo = self.network_api.allocate_for_instance( [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] created_port_ids = self._update_ports_for_instance( [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in 
_update_ports_for_instance [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] with excutils.save_and_reraise_exception(): [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 619.862718] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] self.force_reraise() [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] raise self.value [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] updated_port = self._update_port( [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] _ensure_no_port_binding_failure(port) [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] raise exception.PortBindingFailed(port_id=port['id']) [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] nova.exception.PortBindingFailed: Binding failed for port 
4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. [ 619.863089] env[59857]: ERROR nova.compute.manager [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] [ 619.863089] env[59857]: DEBUG nova.compute.utils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 619.865194] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Build of instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17 was re-scheduled: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 619.865194] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 619.867130] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 619.867130] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquired lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.867130] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 619.877283] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 619.891018] env[59857]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.908138] env[59857]: INFO nova.compute.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Took 0.08 seconds to deallocate network for instance. [ 619.954724] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 619.964647] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341418, 'name': CreateVM_Task, 'duration_secs': 0.244476} completed successfully. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 619.964647] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Created VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 619.964647] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 619.964647] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.964647] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 619.965895] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1566f872-2ee0-46af-a6c8-71151c3c7a06 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.977700] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Waiting for the task: (returnval){ [ 619.977700] env[59857]: value = 
"session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52e2e333-3b9b-07d9-331a-efad42a741a6" [ 619.977700] env[59857]: _type = "Task" [ 619.977700] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 619.989815] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52e2e333-3b9b-07d9-331a-efad42a741a6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 620.040366] env[59857]: INFO nova.scheduler.client.report [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Deleted allocations for instance de589259-86c5-4830-8507-2de7ad76c034 [ 620.076259] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "de589259-86c5-4830-8507-2de7ad76c034" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.938s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.113541] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.124192] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 
tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Releasing lock "refresh_cache-31aed7b3-ea4a-4db4-b919-5d754f4c3b17" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 620.124192] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 620.124192] env[59857]: DEBUG nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 620.124431] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 620.187202] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 620.201008] env[59857]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.215025] env[59857]: INFO nova.compute.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Took 0.09 seconds to deallocate network for instance. [ 620.354731] env[59857]: INFO nova.scheduler.client.report [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Deleted allocations for instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17 [ 620.392281] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "31aed7b3-ea4a-4db4-b919-5d754f4c3b17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.516s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.457543] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "5c575e05-5a7c-49b8-b914-9b4a4e347bfc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.464260] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "5c575e05-5a7c-49b8-b914-9b4a4e347bfc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.007s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.487990] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 620.493069] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 620.493427] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Processing image 4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 620.493526] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.553018] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.553018] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.554611] env[59857]: INFO nova.compute.claims [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 620.716019] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5652ebce-a204-4aa8-b6b1-31bf11d42cb1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.728410] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-245c42f2-1ed0-42bf-9d82-8f8b4ddbba5a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.765301] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ada13527-971c-49ce-90a1-9fd4e1c13a7c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.772448] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3f809f0-f989-4192-9648-e32734289f94 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.787020] env[59857]: DEBUG nova.compute.provider_tree [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 620.800973] env[59857]: DEBUG nova.scheduler.client.report [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 620.815809] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.816419] env[59857]: DEBUG 
nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 620.856526] env[59857]: DEBUG nova.compute.utils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 620.862617] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Not allocating networking since 'none' was specified. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 620.871590] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 620.943823] env[59857]: ERROR nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. 
[ 620.943823] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 620.943823] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 620.943823] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 620.943823] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 620.943823] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 620.943823] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 620.943823] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 620.943823] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 620.943823] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 620.943823] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 620.943823] env[59857]: ERROR nova.compute.manager raise self.value [ 620.943823] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 620.943823] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 620.943823] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 620.943823] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 620.944286] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 620.944286] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 620.944286] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. [ 620.944286] env[59857]: ERROR nova.compute.manager [ 620.944286] env[59857]: Traceback (most recent call last): [ 620.944286] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 620.944286] env[59857]: listener.cb(fileno) [ 620.944286] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 620.944286] env[59857]: result = function(*args, **kwargs) [ 620.944286] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 620.944286] env[59857]: return func(*args, **kwargs) [ 620.944286] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 620.944286] env[59857]: raise e [ 620.944286] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 620.944286] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 620.944286] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 620.944286] env[59857]: created_port_ids = self._update_ports_for_instance( [ 620.944286] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 620.944286] env[59857]: with excutils.save_and_reraise_exception(): [ 620.944286] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 620.944286] env[59857]: self.force_reraise() [ 620.944286] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 620.944286] env[59857]: raise self.value [ 620.944286] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 620.944286] env[59857]: updated_port = self._update_port( [ 620.944286] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 620.944286] env[59857]: _ensure_no_port_binding_failure(port) [ 620.944286] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 620.944286] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 620.944992] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. [ 620.944992] env[59857]: Removing descriptor: 15 [ 620.944992] env[59857]: ERROR nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Traceback (most recent call last): [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] yield resources [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self.driver.spawn(context, instance, image_meta, [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 
17c89372-97f2-4ffa-a13e-606d4f31b08f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 620.944992] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] vm_ref = self.build_virtual_machine(instance, [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] vif_infos = vmwarevif.get_vif_info(self._session, [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] for vif in network_info: [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return self._sync_wrapper(fn, *args, **kwargs) [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self.wait() [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self[:] = self._gt.wait() [ 620.945338] 
env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return self._exit_event.wait() [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 620.945338] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] result = hub.switch() [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return self.greenlet.switch() [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] result = function(*args, **kwargs) [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return func(*args, **kwargs) [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] raise e [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] nwinfo = self.network_api.allocate_for_instance( [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] created_port_ids = self._update_ports_for_instance( [ 620.945696] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] with excutils.save_and_reraise_exception(): [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self.force_reraise() [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] raise self.value [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] updated_port = self._update_port( [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] _ensure_no_port_binding_failure(port) [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] raise exception.PortBindingFailed(port_id=port['id']) [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. [ 620.946015] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] [ 620.946355] env[59857]: INFO nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Terminating instance [ 620.947378] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "refresh_cache-17c89372-97f2-4ffa-a13e-606d4f31b08f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.950944] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquired lock "refresh_cache-17c89372-97f2-4ffa-a13e-606d4f31b08f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 620.950944] env[59857]: DEBUG nova.network.neutron [None 
req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 620.968438] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 620.998817] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 620.999044] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Flavor limits 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 620.999197] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 620.999371] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 620.999570] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 620.999644] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 620.999836] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 620.999984] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 621.000156] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 621.000310] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 621.000499] env[59857]: DEBUG nova.virt.hardware [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 621.001387] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5a8630c-bfc1-4110-9539-c4845e97f307 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.010046] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66c17dd9-d0b1-4f00-9009-a06174a21c49 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.025208] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Instance VIF info [] {{(pid=59857) build_virtual_machine 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 621.034780] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Creating folder: Project (8a8269fa65f541b098ed07cc3679e448). Parent ref: group-v286134. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 621.034780] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.034780] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be6d38d2-1d00-4b43-a9b4-b37702af3109 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.046883] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Created folder: Project (8a8269fa65f541b098ed07cc3679e448) in parent group-v286134. [ 621.047095] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Creating folder: Instances. Parent ref: group-v286141. 
{{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 621.047327] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a395cd1-2094-4fd2-8594-a23b95bb4ea5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.058724] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Created folder: Instances in parent group-v286141. [ 621.058724] env[59857]: DEBUG oslo.service.loopingcall [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 621.058724] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Creating VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 621.059216] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b2b6da16-09a3-426d-bcb7-9b4c7aea4791 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.083481] env[59857]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 621.083481] env[59857]: value = "task-1341421" [ 621.083481] env[59857]: _type = "Task" [ 621.083481] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 621.097753] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341421, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 621.173141] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 621.184826] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Releasing lock "refresh_cache-17c89372-97f2-4ffa-a13e-606d4f31b08f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 621.185646] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 621.185839] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 621.187187] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8e5f88a7-af3b-4ac3-a4b4-84ae5a0b5d02 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.198699] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-303efd51-a060-4edd-97a6-103d027f0d3f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.227324] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 17c89372-97f2-4ffa-a13e-606d4f31b08f could not be found. [ 621.227513] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 621.227698] env[59857]: INFO nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 621.227904] env[59857]: DEBUG oslo.service.loopingcall [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 621.228583] env[59857]: DEBUG nova.compute.manager [-] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 621.228583] env[59857]: DEBUG nova.network.neutron [-] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 621.255816] env[59857]: DEBUG nova.network.neutron [-] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.268275] env[59857]: DEBUG nova.network.neutron [-] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 621.280946] env[59857]: INFO nova.compute.manager [-] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Took 0.05 seconds to deallocate network for instance. 
[ 621.283340] env[59857]: DEBUG nova.compute.claims [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 621.283596] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.284031] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.431415] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2a6cbd0-f746-46c0-8e4a-49db310d0b03 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.440013] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8288829-6d42-4cfc-a7df-fcfe29874bd5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.481451] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19287a3c-b0e8-4480-8ae0-efe626ef328c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.489514] env[59857]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b56fb059-0eb5-465a-b882-a9296a9ef7be {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.506836] env[59857]: DEBUG nova.compute.provider_tree [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.515234] env[59857]: DEBUG nova.scheduler.client.report [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.532106] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.248s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.532727] env[59857]: ERROR nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 
17c89372-97f2-4ffa-a13e-606d4f31b08f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Traceback (most recent call last): [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self.driver.spawn(context, instance, image_meta, [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] vm_ref = self.build_virtual_machine(instance, [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] vif_infos = vmwarevif.get_vif_info(self._session, [ 621.532727] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] for vif in 
network_info: [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return self._sync_wrapper(fn, *args, **kwargs) [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self.wait() [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self[:] = self._gt.wait() [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return self._exit_event.wait() [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] result = hub.switch() [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return self.greenlet.switch() [ 621.533099] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] result = function(*args, **kwargs) [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] return func(*args, **kwargs) [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] raise e [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] nwinfo = self.network_api.allocate_for_instance( [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] created_port_ids = self._update_ports_for_instance( [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] with excutils.save_and_reraise_exception(): [ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 621.533463] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] self.force_reraise()
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] raise self.value
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] updated_port = self._update_port(
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] _ensure_no_port_binding_failure(port)
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] raise exception.PortBindingFailed(port_id=port['id'])
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information.
[ 621.533763] env[59857]: ERROR nova.compute.manager [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f]
[ 621.533763] env[59857]: DEBUG nova.compute.utils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 621.535898] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Build of instance 17c89372-97f2-4ffa-a13e-606d4f31b08f was re-scheduled: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 621.535898] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 621.535898] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "refresh_cache-17c89372-97f2-4ffa-a13e-606d4f31b08f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 621.535898] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquired lock "refresh_cache-17c89372-97f2-4ffa-a13e-606d4f31b08f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 621.538058] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 621.594998] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341421, 'name': CreateVM_Task, 'duration_secs': 0.255447} completed successfully. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 621.596210] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 621.598609] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Created VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 621.599117] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 621.599393] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 621.599856] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 621.602029] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-07f5c0be-f177-451f-9e2e-ed405c31185d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 621.607128] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Waiting for the task: (returnval){
[ 621.607128] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5265522a-10e7-3081-2d15-45cd98cfb2ed"
[ 621.607128] env[59857]: _type = "Task"
[ 621.607128] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 621.616487] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5265522a-10e7-3081-2d15-45cd98cfb2ed, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 621.824655] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 621.846960] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Releasing lock "refresh_cache-17c89372-97f2-4ffa-a13e-606d4f31b08f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 621.846960] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 621.847440] env[59857]: DEBUG nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 621.847616] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 621.940193] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 621.954320] env[59857]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 621.975880] env[59857]: INFO nova.compute.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Took 0.13 seconds to deallocate network for instance.
[ 622.095120] env[59857]: INFO nova.scheduler.client.report [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Deleted allocations for instance 17c89372-97f2-4ffa-a13e-606d4f31b08f
[ 622.121929] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 622.121929] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Processing image 4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 622.121929] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 622.126844] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "17c89372-97f2-4ffa-a13e-606d4f31b08f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.921s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 624.423687] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "11f468ba-a807-4490-9dd5-58eaad007865" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 624.424090] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "11f468ba-a807-4490-9dd5-58eaad007865" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 624.440788] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 624.512904] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 624.512904] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 624.512904] env[59857]: INFO nova.compute.claims [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 624.804440] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4dd1361-c891-4704-9820-f96143cfd47d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 624.817945] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd2ad2a-cca8-4eb9-b31a-2aff7d59059b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 624.858594] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cafd029f-b8ad-4f99-be79-f3cbb40590fd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 624.869798] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-432882da-f5dc-44bb-82fd-74382975750b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 624.888944] env[59857]: DEBUG nova.compute.provider_tree [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 624.899466] env[59857]: DEBUG nova.scheduler.client.report [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 624.917428] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.406s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 624.918879] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 624.964252] env[59857]: DEBUG nova.compute.utils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 624.966409] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Not allocating networking since 'none' was specified. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 624.981078] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 625.062990] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 625.090330] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 625.090704] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 625.090704] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 625.090865] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 625.094352] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 625.094617] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 625.094912] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 625.095014] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 625.095188] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 625.095388] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 625.095561] env[59857]: DEBUG nova.virt.hardware [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 625.099552] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c4954ff-2c49-4485-a9ed-2c21ddfaed81 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.113019] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-697062e1-ee4b-4a81-a7d8-bbdb6d543127 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.127903] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Instance VIF info [] {{(pid=59857) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 625.136022] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Creating folder: Project (6a76486869574011a18c114d723d40aa). Parent ref: group-v286134. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 625.136022] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4fb67159-4646-4beb-b296-cc1a8e837c52 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.148373] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Created folder: Project (6a76486869574011a18c114d723d40aa) in parent group-v286134.
[ 625.148552] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Creating folder: Instances. Parent ref: group-v286144. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 625.148768] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fed89fc8-e4f6-4ea1-bae7-82b402cf934e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.159787] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Created folder: Instances in parent group-v286144.
[ 625.160028] env[59857]: DEBUG oslo.service.loopingcall [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 625.160238] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Creating VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 625.160534] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3f8bef09-0809-471f-b711-8db5d77cfa61 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.178366] env[59857]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 625.178366] env[59857]: value = "task-1341424"
[ 625.178366] env[59857]: _type = "Task"
[ 625.178366] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 625.185871] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341424, 'name': CreateVM_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 625.693482] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341424, 'name': CreateVM_Task, 'duration_secs': 0.265567} completed successfully. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 625.693725] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Created VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 625.694527] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 625.694752] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 625.695105] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 625.697449] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-13ebd517-4a80-438c-b2fc-e67c69fa3cb2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.703455] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Waiting for the task: (returnval){
[ 625.703455] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]528c2144-7574-7e9f-da3d-96e44017627c"
[ 625.703455] env[59857]: _type = "Task"
[ 625.703455] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 625.713505] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]528c2144-7574-7e9f-da3d-96e44017627c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 626.225358] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 626.225554] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Processing image 4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 626.227563] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 626.852150] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.852150] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.852150] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Starting heal instance info cache {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 626.852150] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Rebuilding the list of instances to heal {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 626.867554] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 626.867722] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 626.867914] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 626.868037] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 626.868165] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Didn't find any instances for network info cache update. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 626.870278] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.870527] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.870714] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.870981] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.871361] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.871427] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.871881] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59857) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 626.871881] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager.update_available_resource {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 626.885164] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 626.885164] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 626.885164] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 626.885164] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59857) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 626.887689] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c34b920-e284-4932-a25f-671742be3043 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 626.897676] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dbcc8f5-9758-49a7-aaf1-f50b2d85be46 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 626.912946] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6b13f87-5e1f-42c6-ad8d-18d9facf502f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 626.919820] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d155f896-7649-4cad-af6f-fe7a500e1540 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 626.951696] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181511MB free_disk=154GB free_vcpus=48 pci_devices=None {{(pid=59857) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 626.951892] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 626.951892] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 627.016963] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 21e37459-3ce1-41e7-8317-b98edafb15d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 627.016963] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance b7ab8792-137d-4053-9df9-3d560aa5e411 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 627.016963] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 627.016963] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 11f468ba-a807-4490-9dd5-58eaad007865 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 627.017497] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 627.017497] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 627.105012] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a09c1b3-4d25-4d7e-b173-d12574e7433c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.112298] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3138aa1-74a3-4687-b7cc-7472b5a3847f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.149979] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c6991b4-c576-4a89-8ee7-bb4f0a0eb8e9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.159113] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4236dab-2957-4080-8619-2a1ffdc1aaf0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.173907] env[59857]: DEBUG nova.compute.provider_tree [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed in ProviderTree for provider: 
80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.193865] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.216204] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59857) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 627.216204] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.469339] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.469511] env[59857]: DEBUG oslo_concurrency.lockutils [None 
req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.480511] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 627.550342] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.554326] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.554326] env[59857]: INFO nova.compute.claims [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 627.733146] env[59857]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e45d0ad-d537-4923-919e-c5f5252972b9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.743925] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-123c00d1-9b4c-49ae-9132-343f27ade36d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.776991] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a447ee0-dea2-4a34-9967-82f6c4d33a75 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.785881] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac8a7087-ad94-439c-a7b5-6596c2594871 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.806960] env[59857]: DEBUG nova.compute.provider_tree [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.823497] env[59857]: DEBUG nova.scheduler.client.report [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 
0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.837055] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.837560] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 627.881020] env[59857]: DEBUG nova.compute.utils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 627.881020] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 627.881020] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 627.899438] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 627.970598] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 627.995944] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 627.996325] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 627.996325] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 627.996506] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 
tempest-InstanceActionsV221TestJSON-1809115902-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 627.996646] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 627.996787] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 627.997113] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 627.997180] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 627.997307] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 627.997600] env[59857]: DEBUG nova.virt.hardware 
[None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 627.997679] env[59857]: DEBUG nova.virt.hardware [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 627.998523] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a74762-132b-449c-9a31-247aaa233f3c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.006876] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b66fb87-975d-4cd2-a5e2-d11baa900bb3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.234907] env[59857]: DEBUG nova.policy [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d11f36aa5714e778b9b89d84d55f3b8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1951f850b8e14cf783d324d2842664b8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 629.965127] env[59857]: DEBUG 
nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Successfully created port: d997de27-f617-40ab-a1b9-b240dae29fb1 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 631.124554] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "a5eb8727-e918-4a86-9e40-fe20817ca13c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.124931] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "a5eb8727-e918-4a86-9e40-fe20817ca13c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.137456] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.197436] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.197686] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.199244] env[59857]: INFO nova.compute.claims [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.387675] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f116ae6-5a48-4636-b15a-f304ba92a56e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.396205] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20753aeb-50cf-45a3-9b6b-865e27c90b2f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.428468] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d5f920b-6b4b-4eb1-a0a6-cb873b944a7f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
631.436144] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0b258a1-3a80-4a87-96a9-33c893841f99 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.450848] env[59857]: DEBUG nova.compute.provider_tree [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.464013] env[59857]: DEBUG nova.scheduler.client.report [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.483512] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.483979] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] 
[instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 631.527716] env[59857]: DEBUG nova.compute.utils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 631.531214] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 631.531399] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 631.543588] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 631.650117] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 631.685361] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 631.685361] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 631.685361] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 631.685524] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Flavor pref 0:0:0 {{(pid=59857) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 631.685524] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 631.685524] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 631.685620] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 631.685792] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 631.685919] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 631.686106] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 631.686275] env[59857]: DEBUG nova.virt.hardware [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 631.687434] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075b9805-3804-4f10-94a7-7574059763d9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 631.696769] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b530a03b-8aca-4818-9746-f66e915fd7dc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 631.766805] env[59857]: DEBUG nova.policy [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4aaadc72c029484cb6f29af2622ebe85', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38e80b51cffa497d967daa587f7880af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}}
[ 632.727340] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Successfully created port: 241b104a-67d3-4121-896b-95c8e8ec95a2 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 634.964304] env[59857]: ERROR nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 634.964304] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 634.964304] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 634.964304] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 634.964304] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 634.964304] env[59857]: ERROR nova.compute.manager self.force_reraise()
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 634.964304] env[59857]: ERROR nova.compute.manager raise self.value
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 634.964304] env[59857]: ERROR nova.compute.manager updated_port = self._update_port(
[ 634.964304] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 634.964304] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 634.964979] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 634.964979] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 634.964979] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 634.964979] env[59857]: ERROR nova.compute.manager
[ 634.964979] env[59857]: Traceback (most recent call last):
[ 634.964979] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 634.964979] env[59857]: listener.cb(fileno)
[ 634.964979] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 634.964979] env[59857]: result = function(*args, **kwargs)
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 634.964979] env[59857]: return func(*args, **kwargs)
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 634.964979] env[59857]: raise e
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 634.964979] env[59857]: nwinfo = self.network_api.allocate_for_instance(
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 634.964979] env[59857]: created_port_ids = self._update_ports_for_instance(
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 634.964979] env[59857]: with excutils.save_and_reraise_exception():
[ 634.964979] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 634.964979] env[59857]: self.force_reraise()
[ 634.964979] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 634.964979] env[59857]: raise self.value
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 634.964979] env[59857]: updated_port = self._update_port(
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 634.964979] env[59857]: _ensure_no_port_binding_failure(port)
[ 634.964979] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 634.964979] env[59857]: raise exception.PortBindingFailed(port_id=port['id'])
[ 634.965845] env[59857]: nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 634.965845] env[59857]: Removing descriptor: 15
[ 634.965845] env[59857]: ERROR nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Traceback (most recent call last):
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] yield resources
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self.driver.spawn(context, instance, image_meta,
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 634.965845] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] vm_ref = self.build_virtual_machine(instance,
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] vif_infos = vmwarevif.get_vif_info(self._session,
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] for vif in network_info:
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return self._sync_wrapper(fn, *args, **kwargs)
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self.wait()
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self[:] = self._gt.wait()
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return self._exit_event.wait()
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 634.966449] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] result = hub.switch()
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return self.greenlet.switch()
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] result = function(*args, **kwargs)
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return func(*args, **kwargs)
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] raise e
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] nwinfo = self.network_api.allocate_for_instance(
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] created_port_ids = self._update_ports_for_instance(
[ 634.966874] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] with excutils.save_and_reraise_exception():
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self.force_reraise()
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] raise self.value
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] updated_port = self._update_port(
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] _ensure_no_port_binding_failure(port)
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] raise exception.PortBindingFailed(port_id=port['id'])
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 634.967268] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8]
[ 634.967654] env[59857]: INFO nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Terminating instance
[ 634.969192] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "refresh_cache-fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 634.969343] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquired lock "refresh_cache-fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 634.969501] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 635.041918] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 635.555445] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 635.572169] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Releasing lock "refresh_cache-fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 635.572789] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 635.573145] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 635.573618] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-89bd3f29-f762-41ed-9232-977292c440d5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.593021] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f83ee728-0bb4-4daf-bbec-adf0763fabad {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.619138] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8 could not be found.
[ 635.619138] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 635.619606] env[59857]: INFO nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 635.619606] env[59857]: DEBUG oslo.service.loopingcall [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 635.620085] env[59857]: DEBUG nova.compute.manager [-] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 635.621765] env[59857]: DEBUG nova.network.neutron [-] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 635.762859] env[59857]: DEBUG nova.network.neutron [-] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 635.772946] env[59857]: DEBUG nova.network.neutron [-] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 635.786287] env[59857]: INFO nova.compute.manager [-] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Took 0.17 seconds to deallocate network for instance.
[ 635.791110] env[59857]: DEBUG nova.compute.claims [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 635.791306] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.791519] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.960607] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "34ac1a4c-c729-458f-853f-593e0c935f4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.961033] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "34ac1a4c-c729-458f-853f-593e0c935f4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 636.000871] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 636.023369] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e11c60f4-0c27-4813-b9fc-b779b456a0fe {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 636.032034] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a382f896-13c5-4056-b78a-37f564546276 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 636.065698] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b9d9bfd-10f0-4655-b650-e9dcfb8c31be {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 636.081644] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc56e58d-17a6-4144-b83b-60812cefc44d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 636.097824] env[59857]: DEBUG nova.compute.provider_tree [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 636.100396] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 636.106013] env[59857]: DEBUG nova.scheduler.client.report [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 636.121696] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.330s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 636.122641] env[59857]: ERROR nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Traceback (most recent call last):
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self.driver.spawn(context, instance, image_meta,
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] vm_ref = self.build_virtual_machine(instance,
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] vif_infos = vmwarevif.get_vif_info(self._session,
[ 636.122641] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] for vif in network_info:
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return self._sync_wrapper(fn, *args, **kwargs)
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self.wait()
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self[:] = self._gt.wait()
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return self._exit_event.wait()
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] result = hub.switch()
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return self.greenlet.switch()
[ 636.122994] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] result = function(*args, **kwargs)
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] return func(*args, **kwargs)
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] raise e
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] nwinfo = self.network_api.allocate_for_instance(
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] created_port_ids = self._update_ports_for_instance(
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] with excutils.save_and_reraise_exception():
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 636.123409] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] self.force_reraise()
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] raise self.value
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] updated_port = self._update_port(
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] _ensure_no_port_binding_failure(port)
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] raise exception.PortBindingFailed(port_id=port['id'])
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.
[ 636.123712] env[59857]: ERROR nova.compute.manager [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8]
[ 636.123977] env[59857]: DEBUG nova.compute.utils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 636.124707] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.025s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 636.126129] env[59857]: INFO nova.compute.claims [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 636.128773] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Build of instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8 was re-scheduled: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 636.129233] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 636.129447] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "refresh_cache-fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 636.129587] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquired lock "refresh_cache-fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 636.129742] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 636.215351] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance cache missing network info.
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.292797] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a22b130f-2a77-4317-a5c7-888799e2f3f9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.300818] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-089c3ef5-4c50-4557-bb87-44682e3829ab {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.334604] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d651e90-caf8-4b8a-ab75-5b604381fef0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.342734] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b41eaee7-518a-426a-a4fb-12a9623879c1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.359144] env[59857]: DEBUG nova.compute.provider_tree [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 636.371022] env[59857]: DEBUG nova.scheduler.client.report [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 636.396673] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.398038] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 636.441710] env[59857]: DEBUG nova.compute.utils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 636.443171] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 636.443309] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 636.452719] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 636.531969] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 636.558658] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 636.558890] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 636.559050] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 636.559228] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 
tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 636.559367] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 636.559505] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 636.559708] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 636.559864] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 636.560036] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 636.560197] env[59857]: DEBUG 
nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 636.560362] env[59857]: DEBUG nova.virt.hardware [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 636.561395] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc79c80e-aea9-4ad9-9ee8-9a8099c35940 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.570026] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1809dd9-3838-4166-b049-b7685d6fa486 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.582854] env[59857]: DEBUG nova.policy [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '05f862aa6748451a96b89a90ebbfa4a5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd020d014854949569b330b6a18e177dc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 636.898028] 
env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.912030] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Releasing lock "refresh_cache-fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.912030] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 636.912211] env[59857]: DEBUG nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 636.912725] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 637.159285] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.174385] env[59857]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.187724] env[59857]: INFO nova.compute.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Took 0.28 seconds to deallocate network for instance. 
[ 637.311442] env[59857]: INFO nova.scheduler.client.report [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Deleted allocations for instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8 [ 637.331996] env[59857]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "fddaa828-8da4-4d5d-80d5-484ccf2ab6b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.862s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.024447] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Successfully created port: 5725e74a-7746-478f-a0b4-0542854b6712 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 639.239224] env[59857]: ERROR nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information. 
[ 639.239224] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 639.239224] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.239224] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 639.239224] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.239224] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 639.239224] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.239224] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 639.239224] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.239224] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 639.239224] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.239224] env[59857]: ERROR nova.compute.manager raise self.value [ 639.239224] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.239224] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 639.239224] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.239224] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 639.239861] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.239861] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 639.239861] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information. [ 639.239861] env[59857]: ERROR nova.compute.manager [ 639.244030] env[59857]: Traceback (most recent call last): [ 639.244030] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 639.244030] env[59857]: listener.cb(fileno) [ 639.244030] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.244030] env[59857]: result = function(*args, **kwargs) [ 639.244030] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.244030] env[59857]: return func(*args, **kwargs) [ 639.244030] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.244030] env[59857]: raise e [ 639.244030] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.244030] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 639.244030] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.244030] env[59857]: created_port_ids = self._update_ports_for_instance( [ 639.244030] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.244030] env[59857]: with excutils.save_and_reraise_exception(): [ 639.244030] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.244030] env[59857]: self.force_reraise() [ 639.244030] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.244030] env[59857]: raise self.value [ 639.244030] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.244030] env[59857]: updated_port = self._update_port( [ 639.244030] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.244030] env[59857]: _ensure_no_port_binding_failure(port) [ 639.244030] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.244030] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 639.244030] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information. [ 639.244030] env[59857]: Removing descriptor: 14 [ 639.244829] env[59857]: ERROR nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information. [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Traceback (most recent call last): [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] yield resources [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] self.driver.spawn(context, instance, image_meta, [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: 
a5eb8727-e918-4a86-9e40-fe20817ca13c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] vm_ref = self.build_virtual_machine(instance, [ 639.244829] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] vif_infos = vmwarevif.get_vif_info(self._session, [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] for vif in network_info: [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] return self._sync_wrapper(fn, *args, **kwargs) [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] self.wait() [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] self[:] = self._gt.wait() [ 639.245584] 
env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] return self._exit_event.wait() [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] result = hub.switch() [ 639.245584] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] return self.greenlet.switch() [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] result = function(*args, **kwargs) [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] return func(*args, **kwargs) [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] raise e [ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     nwinfo = self.network_api.allocate_for_instance(
[ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     created_port_ids = self._update_ports_for_instance(
[ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 639.245958] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     with excutils.save_and_reraise_exception():
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     self.force_reraise()
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     raise self.value
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     updated_port = self._update_port(
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     _ensure_no_port_binding_failure(port)
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     raise exception.PortBindingFailed(port_id=port['id'])
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information.
[ 639.246327] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]
[ 639.246742] env[59857]: INFO nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Terminating instance
[ 639.246742] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "refresh_cache-a5eb8727-e918-4a86-9e40-fe20817ca13c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 639.246742] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquired lock "refresh_cache-a5eb8727-e918-4a86-9e40-fe20817ca13c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 639.246872] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 639.475421] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 639.707212] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 639.720674] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Releasing lock "refresh_cache-a5eb8727-e918-4a86-9e40-fe20817ca13c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 639.721090] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 639.721274] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 639.721815] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-765d3cb1-f38e-4a51-b072-c13c7149d812 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.738786] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c5d1a33-903b-4f1c-93a7-86903795c1d2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.774661] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a5eb8727-e918-4a86-9e40-fe20817ca13c could not be found.
[ 639.774661] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 639.774661] env[59857]: INFO nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 639.774661] env[59857]: DEBUG oslo.service.loopingcall [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 639.774661] env[59857]: DEBUG nova.compute.manager [-] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 639.774827] env[59857]: DEBUG nova.network.neutron [-] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 639.814070] env[59857]: DEBUG nova.network.neutron [-] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 639.821922] env[59857]: DEBUG nova.network.neutron [-] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 639.839260] env[59857]: INFO nova.compute.manager [-] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Took 0.06 seconds to deallocate network for instance.
[ 639.839260] env[59857]: DEBUG nova.compute.claims [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 639.839260] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 639.839260] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 640.030747] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74e923ba-1c42-477f-9e10-19c3bfead336 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.041858] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1a70f22-9a63-4823-a0a8-fc3ef361caa9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.080220] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99cf3c27-7d40-47b4-a1e0-f308ee65f0ac {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.088159] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba897080-a4a2-4ff7-930d-e8871d5d6fed {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.106174] env[59857]: DEBUG nova.compute.provider_tree [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 640.116270] env[59857]: DEBUG nova.scheduler.client.report [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 640.133576] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.295s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 640.134240] env[59857]: ERROR nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information.
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Traceback (most recent call last):
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     self.driver.spawn(context, instance, image_meta,
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     vm_ref = self.build_virtual_machine(instance,
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 640.134240] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     for vif in network_info:
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     return self._sync_wrapper(fn, *args, **kwargs)
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     self.wait()
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     self[:] = self._gt.wait()
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     return self._exit_event.wait()
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     result = hub.switch()
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     return self.greenlet.switch()
[ 640.134607] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     result = function(*args, **kwargs)
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     return func(*args, **kwargs)
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     raise e
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     nwinfo = self.network_api.allocate_for_instance(
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     created_port_ids = self._update_ports_for_instance(
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     with excutils.save_and_reraise_exception():
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 640.134978] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     self.force_reraise()
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     raise self.value
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     updated_port = self._update_port(
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     _ensure_no_port_binding_failure(port)
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]     raise exception.PortBindingFailed(port_id=port['id'])
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information.
[ 640.135293] env[59857]: ERROR nova.compute.manager [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c]
[ 640.135293] env[59857]: DEBUG nova.compute.utils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 640.138138] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Build of instance a5eb8727-e918-4a86-9e40-fe20817ca13c was re-scheduled: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 640.138958] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 640.138958] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "refresh_cache-a5eb8727-e918-4a86-9e40-fe20817ca13c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 640.138958] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquired lock "refresh_cache-a5eb8727-e918-4a86-9e40-fe20817ca13c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 640.139133] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 640.203525] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 640.611829] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 640.620356] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Releasing lock "refresh_cache-a5eb8727-e918-4a86-9e40-fe20817ca13c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 640.620522] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 640.620675] env[59857]: DEBUG nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 640.620839] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 640.679907] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 640.692789] env[59857]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 640.704952] env[59857]: INFO nova.compute.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Took 0.08 seconds to deallocate network for instance.
[ 640.858662] env[59857]: INFO nova.scheduler.client.report [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Deleted allocations for instance a5eb8727-e918-4a86-9e40-fe20817ca13c
[ 640.879147] env[59857]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "a5eb8727-e918-4a86-9e40-fe20817ca13c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.754s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 643.240512] env[59857]: ERROR nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.240512] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 643.240512] env[59857]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 643.240512] env[59857]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 643.240512] env[59857]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 643.240512] env[59857]: ERROR nova.compute.manager     self.force_reraise()
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 643.240512] env[59857]: ERROR nova.compute.manager     raise self.value
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 643.240512] env[59857]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 643.240512] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 643.240512] env[59857]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 643.241430] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 643.241430] env[59857]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 643.241430] ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.241430] env[59857]: ERROR nova.compute.manager
[ 643.241430] env[59857]: Traceback (most recent call last):
[ 643.241430] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 643.241430] env[59857]:     listener.cb(fileno)
[ 643.241430] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 643.241430] env[59857]:     result = function(*args, **kwargs)
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 643.241430] env[59857]:     return func(*args, **kwargs)
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 643.241430] env[59857]:     raise e
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 643.241430] env[59857]:     nwinfo = self.network_api.allocate_for_instance(
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 643.241430] env[59857]:     created_port_ids = self._update_ports_for_instance(
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 643.241430] env[59857]:     with excutils.save_and_reraise_exception():
[ 643.241430] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 643.241430] env[59857]:     self.force_reraise()
[ 643.241430] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 643.241430] env[59857]:     raise self.value
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 643.241430] env[59857]:     updated_port = self._update_port(
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 643.241430] env[59857]:     _ensure_no_port_binding_failure(port)
[ 643.241430] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 643.241430] env[59857]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 643.242448] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.242448] env[59857]: Removing descriptor: 17
[ 643.242448] env[59857]: ERROR nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Traceback (most recent call last):
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     yield resources
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     self.driver.spawn(context, instance, image_meta,
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 643.242448] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     vm_ref = self.build_virtual_machine(instance,
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     for vif in network_info:
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     return self._sync_wrapper(fn, *args, **kwargs)
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     self.wait()
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     self[:] = self._gt.wait()
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     return self._exit_event.wait()
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 643.242752] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     result = hub.switch()
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     return self.greenlet.switch()
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     result = function(*args, **kwargs)
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     return func(*args, **kwargs)
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     raise e
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     nwinfo = self.network_api.allocate_for_instance(
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     created_port_ids = self._update_ports_for_instance(
[ 643.243493] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     with excutils.save_and_reraise_exception():
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     self.force_reraise()
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     raise self.value
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     updated_port = self._update_port(
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     _ensure_no_port_binding_failure(port)
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]     raise exception.PortBindingFailed(port_id=port['id'])
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.244640] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]
[ 643.245142] env[59857]: INFO nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Terminating instance
[ 643.245142] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "refresh_cache-34ac1a4c-c729-458f-853f-593e0c935f4c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 643.245142] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquired lock "refresh_cache-34ac1a4c-c729-458f-853f-593e0c935f4c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 
643.245263] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 643.286216] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 643.546428] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 643.562441] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Releasing lock "refresh_cache-34ac1a4c-c729-458f-853f-593e0c935f4c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 643.563151] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 643.563151] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 643.563571] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-76e7ebc3-c9c9-446b-8b26-1028f91835fd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.579558] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89a15e96-8e4d-4dbe-9ecf-c9af4478a44b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.614226] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 34ac1a4c-c729-458f-853f-593e0c935f4c could not be found. 
[ 643.616090] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 643.616090] env[59857]: INFO nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 643.616090] env[59857]: DEBUG oslo.service.loopingcall [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 643.616090] env[59857]: DEBUG nova.compute.manager [-] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 643.616090] env[59857]: DEBUG nova.network.neutron [-] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 643.654085] env[59857]: DEBUG nova.network.neutron [-] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 643.665499] env[59857]: DEBUG nova.network.neutron [-] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 643.680112] env[59857]: INFO nova.compute.manager [-] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Took 0.06 seconds to deallocate network for instance.
[ 643.681591] env[59857]: DEBUG nova.compute.claims [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 643.681761] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 643.681963] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 643.829989] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c8b6752-7b76-4ef6-be04-20125f7aa04a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 643.840458] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b71fb4e-8417-4c8d-be97-9f432ee1267c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 643.874399] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a6801b-2f1c-4de2-8fa1-391d7fa896e1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 643.883216] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17275044-bb81-43aa-8912-7378fed8653c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 643.898009] env[59857]: DEBUG nova.compute.provider_tree [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 643.908261] env[59857]: DEBUG nova.scheduler.client.report [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 643.923967] env[59857]: DEBUG oslo_concurrency.lockutils [None
req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.242s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 643.925204] env[59857]: ERROR nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Traceback (most recent call last):
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] self.driver.spawn(context, instance, image_meta,
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] vm_ref = self.build_virtual_machine(instance,
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] vif_infos = vmwarevif.get_vif_info(self._session,
[ 643.925204] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] for vif in network_info:
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] return self._sync_wrapper(fn, *args, **kwargs)
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] self.wait()
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] self[:] = self._gt.wait()
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] return self._exit_event.wait()
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] result = hub.switch()
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] return self.greenlet.switch()
[ 643.925576] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] result = function(*args, **kwargs)
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] return func(*args, **kwargs)
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] raise e
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] nwinfo = self.network_api.allocate_for_instance(
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] created_port_ids = self._update_ports_for_instance(
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] with excutils.save_and_reraise_exception():
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 643.925913] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] self.force_reraise()
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] raise self.value
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] updated_port = self._update_port(
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] _ensure_no_port_binding_failure(port)
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] raise exception.PortBindingFailed(port_id=port['id'])
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
[ 643.926287] env[59857]: ERROR nova.compute.manager [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c]
[ 643.928302] env[59857]: DEBUG nova.compute.utils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 643.930198] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Build of instance 34ac1a4c-c729-458f-853f-593e0c935f4c was re-scheduled: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 643.930748] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 643.931849] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "refresh_cache-34ac1a4c-c729-458f-853f-593e0c935f4c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 643.931849] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquired lock "refresh_cache-34ac1a4c-c729-458f-853f-593e0c935f4c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 643.931849] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 643.974697] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 644.155131] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 644.166458] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Releasing lock "refresh_cache-34ac1a4c-c729-458f-853f-593e0c935f4c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 644.166689] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 644.166863] env[59857]: DEBUG nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 644.167031] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 644.195916] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 644.210700] env[59857]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 644.222063] env[59857]: INFO nova.compute.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Took 0.05 seconds to deallocate network for instance.
[ 644.339549] env[59857]: INFO nova.scheduler.client.report [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Deleted allocations for instance 34ac1a4c-c729-458f-853f-593e0c935f4c
[ 644.361727] env[59857]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "34ac1a4c-c729-458f-853f-593e0c935f4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.401s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 663.865261] env[59857]: WARNING oslo_vmware.rw_handles [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles response.begin()
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 663.865261] env[59857]: ERROR oslo_vmware.rw_handles
[ 663.866163] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Downloaded image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 663.867138] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Caching image {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 663.867385] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Copying Virtual Disk [datastore2] vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk to [datastore2] vmware_temp/ead15260-9aef-4ff4-beda-8a9ffaa2f1bc/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk {{(pid=59857) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 663.868113] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f75bd8eb-b395-4ec0-a207-a199d055c452 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 663.877527] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Waiting for the task: (returnval){
[ 663.877527] env[59857]: value = "task-1341425"
[ 663.877527] env[59857]: _type = "Task"
[ 663.877527] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 663.886897] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Task: {'id': task-1341425, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 664.393715] env[59857]: DEBUG oslo_vmware.exceptions [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Fault InvalidArgument not matched.
{{(pid=59857) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 664.393984] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 664.395110] env[59857]: ERROR nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 664.395110] env[59857]: Faults: ['InvalidArgument'] [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Traceback (most recent call last): [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] yield resources [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self.driver.spawn(context, instance, image_meta, [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 
21e37459-3ce1-41e7-8317-b98edafb15d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self._fetch_image_if_missing(context, vi) [ 664.395110] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] image_cache(vi, tmp_image_ds_loc) [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] vm_util.copy_virtual_disk( [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] session._wait_for_task(vmdk_copy_task) [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] return self.wait_for_task(task_ref) [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 
21e37459-3ce1-41e7-8317-b98edafb15d4] return evt.wait() [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] result = hub.switch() [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 664.395579] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] return self.greenlet.switch() [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self.f(*self.args, **self.kw) [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] raise exceptions.translate_fault(task_info.error) [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Faults: ['InvalidArgument'] [ 664.395978] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] [ 664.395978] env[59857]: INFO nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] 
[instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Terminating instance [ 664.403324] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 664.403324] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 664.403324] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-271e0514-9d69-4a84-8179-df5fa761709d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.404763] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "refresh_cache-21e37459-3ce1-41e7-8317-b98edafb15d4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 664.404983] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquired lock "refresh_cache-21e37459-3ce1-41e7-8317-b98edafb15d4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 664.405102] env[59857]: DEBUG nova.network.neutron [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 
tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 664.416867] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 664.416867] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59857) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 664.419831] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b835afac-9332-4da0-b096-6f77c3ac319c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.427667] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Waiting for the task: (returnval){ [ 664.427667] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52c3be94-bf52-fee6-16b5-4457ec388fea" [ 664.427667] env[59857]: _type = "Task" [ 664.427667] env[59857]: } to complete. 
{{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 664.442545] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52c3be94-bf52-fee6-16b5-4457ec388fea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 664.498384] env[59857]: DEBUG nova.network.neutron [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 664.834481] env[59857]: DEBUG nova.network.neutron [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 664.849056] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Releasing lock "refresh_cache-21e37459-3ce1-41e7-8317-b98edafb15d4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 664.849056] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 664.851467] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 664.852866] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6904e73c-7ced-4767-af5f-5a3b2eeaf72c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.861828] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Unregistering the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 664.862080] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f406cf4-7389-4b07-abbb-dd200bb26177 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.892806] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Unregistered the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 664.892806] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Deleting contents of the VM from datastore datastore2 
{{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 664.892806] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Deleting the datastore file [datastore2] 21e37459-3ce1-41e7-8317-b98edafb15d4 {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 664.892806] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f11ea2b1-8232-47ad-bf87-6481527506b2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.900435] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Waiting for the task: (returnval){ [ 664.900435] env[59857]: value = "task-1341427" [ 664.900435] env[59857]: _type = "Task" [ 664.900435] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 664.909520] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Task: {'id': task-1341427, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 664.942951] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Preparing fetch location {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 664.943239] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Creating directory with path [datastore2] vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 664.943473] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-718870aa-aacf-42f5-95c1-160a1849a302 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.956273] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Created directory with path [datastore2] vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 664.956473] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Fetch image to [datastore2] vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 664.957466] env[59857]: DEBUG 
nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to [datastore2] vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 664.957580] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe05d9c0-6492-4c53-a60d-b7bce67240b2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.965036] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b8fe946-5bef-4da8-ba0f-de7fdc0e656d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.981101] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f002ce8-c008-4fb5-94ff-02beab951b22 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.022672] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b28fd5c-3273-42ed-aaba-9345de333f86 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.030177] env[59857]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-686b46eb-4a68-4296-bbee-6e439c371e2b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.128631] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 
tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 665.193383] env[59857]: DEBUG oslo_vmware.rw_handles [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59857) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 665.256275] env[59857]: DEBUG oslo_vmware.rw_handles [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Completed reading data from the image iterator. {{(pid=59857) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 665.256469] env[59857]: DEBUG oslo_vmware.rw_handles [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59857) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 665.413598] env[59857]: DEBUG oslo_vmware.api [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Task: {'id': task-1341427, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035311} completed successfully. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 665.414395] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Deleted the datastore file {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 665.415310] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Deleted contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 665.415310] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 665.415310] env[59857]: INFO nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Took 0.57 seconds to destroy the instance on the hypervisor. 
[ 665.415738] env[59857]: DEBUG oslo.service.loopingcall [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 665.416078] env[59857]: DEBUG nova.compute.manager [-] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Skipping network deallocation for instance since networking was not requested. {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 665.419243] env[59857]: DEBUG nova.compute.claims [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 665.421106] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.421106] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.553390] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-209a22c9-1afc-471f-b9da-288394083335 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.561370] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25182fad-2a9b-4dee-b1f6-2d86494d5ee1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.592653] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45ed4bf2-480a-4607-95a1-03856aea23e8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.603088] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-177cabb3-dbaa-4a89-9232-93b1589b7685 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.619017] env[59857]: DEBUG nova.compute.provider_tree [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 665.629364] env[59857]: DEBUG nova.scheduler.client.report [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 665.652856] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.231s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.652856] env[59857]: ERROR nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.652856] env[59857]: Faults: ['InvalidArgument'] [ 665.652856] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Traceback (most recent call last): [ 665.652856] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 665.652856] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self.driver.spawn(context, instance, image_meta, [ 665.652856] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 665.652856] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 665.652856] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 665.652856] env[59857]: ERROR 
nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self._fetch_image_if_missing(context, vi) [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] image_cache(vi, tmp_image_ds_loc) [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] vm_util.copy_virtual_disk( [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] session._wait_for_task(vmdk_copy_task) [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] return self.wait_for_task(task_ref) [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] return evt.wait() [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 
21e37459-3ce1-41e7-8317-b98edafb15d4] result = hub.switch() [ 665.653587] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] return self.greenlet.switch() [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] self.f(*self.args, **self.kw) [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] raise exceptions.translate_fault(task_info.error) [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Faults: ['InvalidArgument'] [ 665.654268] env[59857]: ERROR nova.compute.manager [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] [ 665.654268] env[59857]: DEBUG nova.compute.utils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] VimFaultException {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 665.655345] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 
tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Build of instance 21e37459-3ce1-41e7-8317-b98edafb15d4 was re-scheduled: A specified parameter was not correct: fileType [ 665.655345] env[59857]: Faults: ['InvalidArgument'] {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 665.655976] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 665.656407] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "refresh_cache-21e37459-3ce1-41e7-8317-b98edafb15d4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 665.656668] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquired lock "refresh_cache-21e37459-3ce1-41e7-8317-b98edafb15d4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 665.656942] env[59857]: DEBUG nova.network.neutron [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 665.717160] env[59857]: DEBUG nova.network.neutron [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e 
tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 666.066705] env[59857]: DEBUG nova.network.neutron [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 666.101029] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Releasing lock "refresh_cache-21e37459-3ce1-41e7-8317-b98edafb15d4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 666.101029] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 666.102202] env[59857]: DEBUG nova.compute.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 666.219866] env[59857]: INFO nova.scheduler.client.report [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Deleted allocations for instance 21e37459-3ce1-41e7-8317-b98edafb15d4 [ 666.276822] env[59857]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "21e37459-3ce1-41e7-8317-b98edafb15d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 54.324s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.201293] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.223655] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.223655] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.840525] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59857) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.841543] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.841634] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Starting heal instance info cache {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 687.841840] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Rebuilding the list of instances to heal {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 687.857187] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 687.857187] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 687.857187] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 687.857187] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Didn't find any instances for network info cache update. 
{{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 687.857187] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.857187] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.857382] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.857382] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59857) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 687.857547] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager.update_available_resource {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.869260] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.869608] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.869850] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.870060] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59857) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 687.872319] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e769b117-d6f6-40e2-935c-80fa5cd806ae {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.882676] env[59857]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfa2e9da-b850-4407-80bc-caa5fbd8d4a6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.901683] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72585202-caf7-4b2d-b76a-213bf81d2023 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.909676] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9861d46b-2031-4dc7-820e-a3f8edadf0a5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.943221] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181514MB free_disk=154GB free_vcpus=48 pci_devices=None {{(pid=59857) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 687.943414] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.943567] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.009768] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance b7ab8792-137d-4053-9df9-3d560aa5e411 actively managed on this compute host 
and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.009926] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.010064] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 11f468ba-a807-4490-9dd5-58eaad007865 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.010249] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 688.010386] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 688.072942] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-256a562c-8ac4-4514-adfc-07d76041bce1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.081567] env[59857]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f41d2389-2124-418a-be1b-60a5656685cd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.120195] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-219568bb-27e8-4520-a77a-da35b308fa76 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.127727] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7aac912-c919-4683-b32f-a9dac89ee958 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.142174] env[59857]: DEBUG nova.compute.provider_tree [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 688.151271] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 688.168981] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59857) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 
688.169177] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.152559] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.634371] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "67b01666-6233-4af8-a0ec-a4e938b82606" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.634637] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "67b01666-6233-4af8-a0ec-a4e938b82606" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.647488] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 696.720828] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.720828] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.722311] env[59857]: INFO nova.compute.claims [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 696.892039] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c5943d6-e24f-493f-8ef3-775290e3d8bc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.901157] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27aa73cc-5870-4985-89b6-cfcb7eb96962 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.941577] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c2759a8-4377-4904-9692-738ed30df198 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.949855] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5331c235-4e2c-44f2-a44a-f2bfbdea89cf {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.966325] env[59857]: DEBUG nova.compute.provider_tree [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 696.976254] env[59857]: DEBUG nova.scheduler.client.report [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 696.990578] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.991092] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c 
tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 697.040072] env[59857]: DEBUG nova.compute.utils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.042279] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 697.042495] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 697.069216] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 697.159996] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 697.187129] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 697.187129] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 697.187320] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 
tempest-AttachInterfacesTestJSON-287887941-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 697.187489] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 697.187779] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 697.187779] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 697.187961] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 697.188404] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 697.188595] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c 
tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 697.188756] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 697.188920] env[59857]: DEBUG nova.virt.hardware [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 697.190172] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88c14034-6a32-416c-a277-baf11df7ef15 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.203930] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a59429-7e16-4b49-bbf2-05c04e9be4e7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.689835] env[59857]: DEBUG nova.policy [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a2e115241bf4f4491e4736c14c8c75f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38f2b7c76cb04e49b1b8ac75980011b2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 
'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 699.351658] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Successfully created port: f4843203-5a26-458c-986f-a4c59da7d9c3 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 700.408961] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "37331817-f277-4f32-8d5a-11e1cf63f2b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.409246] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "37331817-f277-4f32-8d5a-11e1cf63f2b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.431543] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 700.503096] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.503096] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.503096] env[59857]: INFO nova.compute.claims [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 700.598683] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "5b59d527-232e-4ef1-bc83-4e8671607db1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.599136] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "5b59d527-232e-4ef1-bc83-4e8671607db1" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.612155] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 700.676953] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.688299] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f08458f-c71c-4f8a-bcc7-daadf0433c81 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.696456] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ee49f5-7020-4629-bf48-a3ffae6f7564 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.730674] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-543f47a5-559a-4e99-b813-5c114692bef9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.738603] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-930c55ce-a8ec-4470-826f-ab80fc652df7 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.753800] env[59857]: DEBUG nova.compute.provider_tree [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 700.764626] env[59857]: DEBUG nova.scheduler.client.report [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 700.778626] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.779055] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 700.784691] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.105s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.784691] env[59857]: INFO nova.compute.claims [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 700.818830] env[59857]: DEBUG nova.compute.utils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 700.822428] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 700.822626] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 700.829446] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 700.915092] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 700.918984] env[59857]: DEBUG nova.policy [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ed904e8ae1e4c3093ea78c668aa6573', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0ca4a4d35bf24c4888af9ba6b2f4717b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 700.943753] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 700.943969] env[59857]: DEBUG nova.virt.hardware [None 
req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 700.944177] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 700.944381] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 700.944611] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 700.944869] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 700.945020] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 700.945186] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 700.945502] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 700.945502] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 700.945669] env[59857]: DEBUG nova.virt.hardware [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 700.946757] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07e51aa-5335-496b-8977-f029c938f010 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.950682] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e788402-fc1b-4bd5-b828-478a6ad7c280 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.961749] env[59857]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56734c77-010f-42ae-b3d0-c5802c556972 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.966665] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d51e96e-86f7-4342-8c71-0a5062208be6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.007658] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc66073e-32ad-4020-aa94-e2fc5f573ac1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.013423] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a0db4bb-ae14-4dbc-b192-9cec3fc86a90 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.028267] env[59857]: DEBUG nova.compute.provider_tree [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 701.045192] env[59857]: DEBUG nova.scheduler.client.report [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 701.063660] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.064314] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 701.112608] env[59857]: DEBUG nova.compute.utils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 701.114855] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 701.114855] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 701.127112] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 701.198148] env[59857]: DEBUG nova.policy [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f78185aeccd4e96b19c49aa985f446d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6448dab5c83b44e48c3ea2bd37691788', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 701.211167] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 701.239248] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 701.239491] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 701.239642] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 701.240207] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 701.240207] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 701.241182] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 701.244692] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 701.244692] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 701.244692] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 701.244692] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 
tempest-SecurityGroupsTestJSON-1410390636-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 701.244692] env[59857]: DEBUG nova.virt.hardware [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 701.244926] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27de3128-6b1d-4825-978b-3ff07a96944a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.251308] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3c864e5-e66e-4d1a-b136-92f9ad847a1f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.639715] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Successfully created port: b8a09541-861d-4a6e-bed9-a46fc8baadf6 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.254414] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Successfully created port: 3fc4c3ad-2447-45d8-941c-973977c7c5b9 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.423865] env[59857]: DEBUG oslo_concurrency.lockutils [None 
req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.424212] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.439432] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 702.506126] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.507035] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.508057] env[59857]: INFO nova.compute.claims [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 702.689961] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f19877-6b53-4c01-abd2-5e7c7b626e8a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.701416] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fab423d3-2070-4431-9294-9c6bbe08c0cf {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.736251] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad2182c8-1cc2-4cb5-9e2b-0184834f0e04 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.748091] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34cd4483-f6a1-4423-9310-74b89dbd1c99 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.763712] env[59857]: DEBUG nova.compute.provider_tree [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.772732] env[59857]: DEBUG nova.scheduler.client.report [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.799136] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.799670] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 
tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 702.854080] env[59857]: DEBUG nova.compute.utils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 702.854080] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 702.854080] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 702.867675] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 702.958657] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 702.988528] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 702.988941] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 702.988941] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 702.989151] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 702.989205] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 702.989332] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 702.989536] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 702.989842] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 702.989842] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 702.989997] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 702.990263] env[59857]: DEBUG nova.virt.hardware [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 702.991109] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53815bd5-ba7d-453a-99a1-0d9402052398 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.002983] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e039402b-ab30-4b80-8aec-89c633ac64e8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.238232] env[59857]: DEBUG nova.policy [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ae0c3fdf5814c20819e4329e87733e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd742fb05f93f44a9b9c8207f47e77730', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 705.130089] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 
5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Successfully created port: 301ba51c-e66d-4f6a-b589-40341a138ddf {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 705.646427] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "db69e0d0-724b-4a87-80f5-390cfc395ee9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.648855] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "db69e0d0-724b-4a87-80f5-390cfc395ee9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.663103] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 705.731168] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.731168] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.732018] env[59857]: INFO nova.compute.claims [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 705.940910] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9585aba5-2b0d-4d37-b8cf-a36e28f09dab {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.955036] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fc00549-3c75-495d-9d5a-9c9eb6502d9a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.996446] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e7aa8e8-b4e5-4635-9871-6e23adcf3917 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.006227] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74e6cb27-5758-4f5f-be88-e0f2a75f3d9b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.026357] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "29c85bbf-553e-4b82-ad7c-5341ffc5af63" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.026357] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "29c85bbf-553e-4b82-ad7c-5341ffc5af63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.027172] env[59857]: DEBUG nova.compute.provider_tree [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.037649] env[59857]: DEBUG nova.scheduler.client.report [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.043137] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 706.057729] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.057729] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 706.097861] env[59857]: DEBUG nova.compute.utils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 706.100228] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.100498] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.101821] env[59857]: INFO nova.compute.claims [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 706.104034] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 706.104295] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 706.117891] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 706.194417] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 706.222587] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 706.222587] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 706.222587] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 706.223287] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Flavor 
pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 706.223287] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 706.223287] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 706.223287] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 706.223287] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 706.223525] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 706.223525] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 
tempest-AttachInterfacesV270Test-1791144961-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 706.223525] env[59857]: DEBUG nova.virt.hardware [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 706.223525] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaf8f527-e879-46cc-943d-7eec475373cd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.235305] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23d0bca3-5217-41e1-b6ca-c4603c036cef {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.322141] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78fb075d-4135-4db1-b9ce-71fbb4abd3ea {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.331420] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2421c73-f219-46d1-ae85-3a567a649b0e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.367439] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12279e35-afb1-4287-9947-52c52de9be40 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.376268] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fec81efe-fb80-47d7-a893-03973aaad952 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.389206] env[59857]: DEBUG nova.compute.provider_tree [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.399226] env[59857]: DEBUG nova.scheduler.client.report [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.417078] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.417581] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Start building networks 
asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 706.459336] env[59857]: DEBUG nova.compute.utils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 706.460420] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 706.460589] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 706.472455] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 706.538783] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 706.541806] env[59857]: ERROR nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. [ 706.541806] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 706.541806] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 706.541806] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 706.541806] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 706.541806] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 706.541806] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 706.541806] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 706.541806] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.541806] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 706.541806] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.541806] env[59857]: ERROR nova.compute.manager raise self.value [ 706.541806] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 706.541806] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 706.541806] env[59857]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 706.541806] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 706.542223] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.542223] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 706.542223] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. [ 706.542223] env[59857]: ERROR nova.compute.manager [ 706.542223] env[59857]: Traceback (most recent call last): [ 706.542223] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 706.542223] env[59857]: listener.cb(fileno) [ 706.542223] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 706.542223] env[59857]: result = function(*args, **kwargs) [ 706.542223] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 706.542223] env[59857]: return func(*args, **kwargs) [ 706.542223] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 706.542223] env[59857]: raise e [ 706.542223] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 706.542223] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 706.542223] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 706.542223] env[59857]: created_port_ids = self._update_ports_for_instance( [ 706.542223] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 706.542223] env[59857]: with excutils.save_and_reraise_exception(): [ 706.542223] env[59857]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.542223] env[59857]: self.force_reraise() [ 706.542223] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.542223] env[59857]: raise self.value [ 706.542223] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 706.542223] env[59857]: updated_port = self._update_port( [ 706.542223] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 706.542223] env[59857]: _ensure_no_port_binding_failure(port) [ 706.542223] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.542223] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 706.543370] env[59857]: nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. [ 706.543370] env[59857]: Removing descriptor: 17 [ 706.543370] env[59857]: ERROR nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. 
[ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Traceback (most recent call last): [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] yield resources [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self.driver.spawn(context, instance, image_meta, [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self._vmops.spawn(context, instance, image_meta, injected_files, [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 706.543370] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] vm_ref = self.build_virtual_machine(instance, [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] vif_infos = vmwarevif.get_vif_info(self._session, [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 706.543685] env[59857]: ERROR 
nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] for vif in network_info: [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return self._sync_wrapper(fn, *args, **kwargs) [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self.wait() [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self[:] = self._gt.wait() [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return self._exit_event.wait() [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 706.543685] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] result = hub.switch() [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return self.greenlet.switch() [ 706.544114] env[59857]: ERROR 
nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] result = function(*args, **kwargs) [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return func(*args, **kwargs) [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] raise e [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] nwinfo = self.network_api.allocate_for_instance( [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] created_port_ids = self._update_ports_for_instance( [ 706.544114] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] with excutils.save_and_reraise_exception(): [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 
67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self.force_reraise() [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] raise self.value [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] updated_port = self._update_port( [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] _ensure_no_port_binding_failure(port) [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] raise exception.PortBindingFailed(port_id=port['id']) [ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. 
[ 706.544476] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] [ 706.544836] env[59857]: INFO nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Terminating instance [ 706.545814] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "refresh_cache-67b01666-6233-4af8-a0ec-a4e938b82606" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 706.545814] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquired lock "refresh_cache-67b01666-6233-4af8-a0ec-a4e938b82606" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 706.545814] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 706.565434] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 706.565657] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 706.565956] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 706.566295] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 706.566295] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Image pref 0:0:0 {{(pid=59857) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 706.566295] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 706.567186] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 706.567186] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 706.567186] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 706.567186] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 706.567186] env[59857]: DEBUG nova.virt.hardware [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be 
tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 706.569955] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec55f5f3-b484-42df-86af-e74372b71d50 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.576488] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a1da92d-290f-4818-b231-a33c9ce287c6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.616043] env[59857]: DEBUG nova.policy [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9f30a9b74cf54f4dbc2aa343e2b3298e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3a9c7dc4e938463781dbded4e5382286', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 706.654265] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.729887] env[59857]: DEBUG nova.policy [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'baef766334764dd9ab481d3a2aacd07b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9e33b2e4b8c439a8e8a557ddda22fce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 706.851275] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "40115f76-28d8-4f39-9dca-59401f52f22f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.851506] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "40115f76-28d8-4f39-9dca-59401f52f22f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.866861] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 
40115f76-28d8-4f39-9dca-59401f52f22f] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 706.917058] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.917292] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.918706] env[59857]: INFO nova.compute.claims [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 707.101031] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f77690d4-9217-4eb4-b853-c3b90fc668da {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.109233] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d91d7820-f818-41fd-8d34-4110a754ac21 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.142388] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d209c6f0-d937-42a0-86b4-a2b3b496d2b1 
{{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.151121] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05da5ef8-9d21-4ed8-b54f-6cf9e7051b45 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.166092] env[59857]: DEBUG nova.compute.provider_tree [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.168075] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.176215] env[59857]: DEBUG nova.scheduler.client.report [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.184294] env[59857]: DEBUG oslo_concurrency.lockutils [None 
req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Releasing lock "refresh_cache-67b01666-6233-4af8-a0ec-a4e938b82606" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 707.184668] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 707.184848] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 707.185472] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c1b181a7-8bca-4a09-8b00-c20e81649403 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.195627] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62806425-b5da-4db8-b793-37360137f1a7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.209446] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.209855] env[59857]: 
DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 707.228340] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67b01666-6233-4af8-a0ec-a4e938b82606 could not be found. [ 707.228651] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 707.228848] env[59857]: INFO nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Took 0.04 seconds to destroy the instance on the hypervisor. [ 707.229291] env[59857]: DEBUG oslo.service.loopingcall [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 707.230183] env[59857]: DEBUG nova.compute.manager [-] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 707.230289] env[59857]: DEBUG nova.network.neutron [-] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 707.259749] env[59857]: DEBUG nova.compute.utils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 707.259749] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 707.259749] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 707.270868] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 707.337915] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 707.360505] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 707.360725] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 707.360904] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a 
tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 707.361176] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 707.361433] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 707.361639] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 707.365079] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 707.365079] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 707.365079] env[59857]: DEBUG nova.virt.hardware 
[None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 707.365079] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 707.365079] env[59857]: DEBUG nova.virt.hardware [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 707.365277] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36b71d6b-9e85-4b4f-a445-8e1a4ccfb954 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.373890] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06381384-85ac-4c01-962d-4dc586166bd8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.472947] env[59857]: DEBUG nova.network.neutron [-] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.482014] env[59857]: DEBUG nova.network.neutron [-] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.490566] env[59857]: INFO nova.compute.manager [-] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Took 0.26 seconds to deallocate network for instance. [ 707.492533] env[59857]: DEBUG nova.compute.claims [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 707.492732] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.492963] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.703847] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38aaa319-ebc8-44fa-b73c-b10ea4c61c4d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.713087] env[59857]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1dff142-9875-4d81-bf9a-148479e871df {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.753619] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1d4c3e-9134-4967-b2b5-4f329b18b554 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.762491] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd23e4f0-25e1-4ede-8876-4de913bb21e8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.778175] env[59857]: DEBUG nova.compute.provider_tree [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.790785] env[59857]: DEBUG nova.scheduler.client.report [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.809584] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c 
tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.316s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.810391] env[59857]: ERROR nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Traceback (most recent call last): [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self.driver.spawn(context, instance, image_meta, [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self._vmops.spawn(context, instance, image_meta, injected_files, [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] vm_ref = self.build_virtual_machine(instance, [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] vif_infos = vmwarevif.get_vif_info(self._session, [ 707.810391] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] for vif in network_info: [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return self._sync_wrapper(fn, *args, **kwargs) [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self.wait() [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self[:] = self._gt.wait() [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return self._exit_event.wait() [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 707.810683] env[59857]: ERROR 
nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] result = hub.switch() [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return self.greenlet.switch() [ 707.810683] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] result = function(*args, **kwargs) [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] return func(*args, **kwargs) [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] raise e [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] nwinfo = self.network_api.allocate_for_instance( [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] created_port_ids = 
self._update_ports_for_instance( [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] with excutils.save_and_reraise_exception(): [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.811023] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] self.force_reraise() [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] raise self.value [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] updated_port = self._update_port( [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] _ensure_no_port_binding_failure(port) [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] raise 
exception.PortBindingFailed(port_id=port['id']) [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. [ 707.811332] env[59857]: ERROR nova.compute.manager [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] [ 707.811801] env[59857]: DEBUG nova.compute.utils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 707.813396] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Build of instance 67b01666-6233-4af8-a0ec-a4e938b82606 was re-scheduled: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 707.813867] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 707.814161] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "refresh_cache-67b01666-6233-4af8-a0ec-a4e938b82606" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.814476] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquired lock "refresh_cache-67b01666-6233-4af8-a0ec-a4e938b82606" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.814680] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 707.865578] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.952460] env[59857]: DEBUG nova.policy [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f6fc368c3264c3b902fa4539548cb86', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3794741ad1ea4973b8fbab68114f28c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 708.253395] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.265445] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Successfully created port: 879190bd-d68a-455e-abd2-73e8f85c3e28 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 708.269536] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Releasing lock "refresh_cache-67b01666-6233-4af8-a0ec-a4e938b82606" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
708.269737] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 708.270131] env[59857]: DEBUG nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 708.270366] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 708.334824] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.343182] env[59857]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.374611] env[59857]: INFO nova.compute.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Took 0.10 seconds to deallocate network for instance. [ 708.487457] env[59857]: INFO nova.scheduler.client.report [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Deleted allocations for instance 67b01666-6233-4af8-a0ec-a4e938b82606 [ 708.515208] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "67b01666-6233-4af8-a0ec-a4e938b82606" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.880s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.048184] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Successfully created port: 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 709.119703] env[59857]: ERROR nova.compute.manager [None 
req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. [ 709.119703] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 709.119703] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.119703] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 709.119703] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.119703] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 709.119703] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.119703] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 709.119703] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.119703] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 709.119703] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.119703] env[59857]: ERROR nova.compute.manager raise self.value [ 709.119703] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.119703] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 709.119703] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.119703] env[59857]: ERROR 
nova.compute.manager _ensure_no_port_binding_failure(port) [ 709.120261] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.120261] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 709.120261] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. [ 709.120261] env[59857]: ERROR nova.compute.manager [ 709.120261] env[59857]: Traceback (most recent call last): [ 709.120261] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 709.120261] env[59857]: listener.cb(fileno) [ 709.120261] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 709.120261] env[59857]: result = function(*args, **kwargs) [ 709.120261] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 709.120261] env[59857]: return func(*args, **kwargs) [ 709.120261] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 709.120261] env[59857]: raise e [ 709.120261] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.120261] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 709.120261] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.120261] env[59857]: created_port_ids = self._update_ports_for_instance( [ 709.120261] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.120261] env[59857]: with excutils.save_and_reraise_exception(): [ 709.120261] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.120261] env[59857]: self.force_reraise() [ 709.120261] 
env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.120261] env[59857]: raise self.value [ 709.120261] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.120261] env[59857]: updated_port = self._update_port( [ 709.120261] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.120261] env[59857]: _ensure_no_port_binding_failure(port) [ 709.120261] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.120261] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 709.120874] env[59857]: nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. [ 709.120874] env[59857]: Removing descriptor: 14 [ 709.120874] env[59857]: ERROR nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. 
[ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Traceback (most recent call last): [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] yield resources [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self.driver.spawn(context, instance, image_meta, [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 709.120874] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] vm_ref = self.build_virtual_machine(instance, [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] vif_infos = vmwarevif.get_vif_info(self._session, [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 709.121239] env[59857]: ERROR 
nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] for vif in network_info: [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return self._sync_wrapper(fn, *args, **kwargs) [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self.wait() [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self[:] = self._gt.wait() [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return self._exit_event.wait() [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 709.121239] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] result = hub.switch() [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return self.greenlet.switch() [ 709.121605] env[59857]: ERROR 
nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] result = function(*args, **kwargs) [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return func(*args, **kwargs) [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] raise e [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] nwinfo = self.network_api.allocate_for_instance( [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] created_port_ids = self._update_ports_for_instance( [ 709.121605] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] with excutils.save_and_reraise_exception(): [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 
37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self.force_reraise() [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] raise self.value [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] updated_port = self._update_port( [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] _ensure_no_port_binding_failure(port) [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] raise exception.PortBindingFailed(port_id=port['id']) [ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. 
[ 709.121900] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7]
[ 709.122198] env[59857]: INFO nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Terminating instance
[ 709.123398] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "refresh_cache-37331817-f277-4f32-8d5a-11e1cf63f2b7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 709.123797] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquired lock "refresh_cache-37331817-f277-4f32-8d5a-11e1cf63f2b7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 709.123990] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 709.249726] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 709.497436] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "6576c530-0b88-453e-bace-70a4f1c76d3c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 709.497436] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "6576c530-0b88-453e-bace-70a4f1c76d3c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 709.506128] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 709.556646] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 709.556646] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 709.556646] env[59857]: INFO nova.compute.claims [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 709.749096] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a0243a2-2376-43e1-baed-07e3ca27b064 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.757966] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31fa0b22-04e6-4371-9e0b-acbe63111ba5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.789439] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b31438fa-d039-4ba7-a2f8-82e592f4346a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.796705] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96b9795c-6732-4246-bf3b-07ebc97196cc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.811501] env[59857]: DEBUG nova.compute.provider_tree [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 709.823476] env[59857]: DEBUG nova.scheduler.client.report [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 709.841020] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 709.841020] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 709.877125] env[59857]: DEBUG nova.compute.utils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 709.877628] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 709.877934] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 709.892541] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 709.961797] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 709.986435] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=<?>,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-10-23T14:42:58Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 709.986818] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 709.987122] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 709.987455] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 709.987735] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 709.988011] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 709.988380] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 709.988869] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 709.989208] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 709.989512] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 709.989819] env[59857]: DEBUG nova.virt.hardware [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 709.991634] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7397a4f5-195b-4a76-bab2-454596360133 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.006189] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0f7d37b-c0f9-43c4-8fab-ba1915f9172c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.068296] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 710.090620] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Releasing lock "refresh_cache-37331817-f277-4f32-8d5a-11e1cf63f2b7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 710.090620] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 710.090620] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 710.090620] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dd9b9bce-abc1-4c7b-b2e1-2fcb85f90387 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.100319] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd8ab92-4f56-45b7-be0a-e9625dbec81d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.127491] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 37331817-f277-4f32-8d5a-11e1cf63f2b7 could not be found.
[ 710.127949] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 710.128456] env[59857]: INFO nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 710.128756] env[59857]: DEBUG oslo.service.loopingcall [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 710.129852] env[59857]: DEBUG nova.compute.manager [-] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 710.129852] env[59857]: DEBUG nova.network.neutron [-] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 710.145220] env[59857]: DEBUG nova.policy [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4be8262f30274ebb9516f2ec280a6a40', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8349984ea81841b8880696d0a1326b35', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}}
[ 710.216883] env[59857]: ERROR nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
[ 710.216883] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 710.216883] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 710.216883] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 710.216883] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 710.216883] env[59857]: ERROR nova.compute.manager self.force_reraise()
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 710.216883] env[59857]: ERROR nova.compute.manager raise self.value
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 710.216883] env[59857]: ERROR nova.compute.manager updated_port = self._update_port(
[ 710.216883] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 710.216883] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 710.217290] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 710.217290] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 710.217290] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
[ 710.217290] env[59857]: ERROR nova.compute.manager
[ 710.217290] env[59857]: Traceback (most recent call last):
[ 710.217290] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 710.217290] env[59857]: listener.cb(fileno)
[ 710.217290] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 710.217290] env[59857]: result = function(*args, **kwargs)
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 710.217290] env[59857]: return func(*args, **kwargs)
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 710.217290] env[59857]: raise e
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 710.217290] env[59857]: nwinfo = self.network_api.allocate_for_instance(
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 710.217290] env[59857]: created_port_ids = self._update_ports_for_instance(
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 710.217290] env[59857]: with excutils.save_and_reraise_exception():
[ 710.217290] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 710.217290] env[59857]: self.force_reraise()
[ 710.217290] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 710.217290] env[59857]: raise self.value
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 710.217290] env[59857]: updated_port = self._update_port(
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 710.217290] env[59857]: _ensure_no_port_binding_failure(port)
[ 710.217290] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 710.217290] env[59857]: raise exception.PortBindingFailed(port_id=port['id'])
[ 710.217896] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
[ 710.217896] env[59857]: Removing descriptor: 15
[ 710.217896] env[59857]: ERROR nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Traceback (most recent call last):
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] yield resources
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self.driver.spawn(context, instance, image_meta,
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 710.217896] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] vm_ref = self.build_virtual_machine(instance,
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] vif_infos = vmwarevif.get_vif_info(self._session,
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] for vif in network_info:
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return self._sync_wrapper(fn, *args, **kwargs)
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self.wait()
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self[:] = self._gt.wait()
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return self._exit_event.wait()
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 710.218170] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] result = hub.switch()
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return self.greenlet.switch()
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] result = function(*args, **kwargs)
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return func(*args, **kwargs)
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] raise e
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] nwinfo = self.network_api.allocate_for_instance(
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] created_port_ids = self._update_ports_for_instance(
[ 710.218503] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] with excutils.save_and_reraise_exception():
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self.force_reraise()
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] raise self.value
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] updated_port = self._update_port(
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] _ensure_no_port_binding_failure(port)
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] raise exception.PortBindingFailed(port_id=port['id'])
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
[ 710.218811] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1]
[ 710.219166] env[59857]: INFO nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Terminating instance
[ 710.223015] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "refresh_cache-5b59d527-232e-4ef1-bc83-4e8671607db1" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 710.223015] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquired lock "refresh_cache-5b59d527-232e-4ef1-bc83-4e8671607db1" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 710.223015] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 710.223015] env[59857]: DEBUG nova.network.neutron [-] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 710.236037] env[59857]: DEBUG nova.network.neutron [-] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 710.244267] env[59857]: INFO nova.compute.manager [-] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Took 0.11 seconds to deallocate network for instance.
[ 710.244685] env[59857]: DEBUG nova.compute.claims [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 710.244864] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.245298] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.355077] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 710.478016] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a543dc1-99c3-479b-b9bb-83b79c9befb1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.486138] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-914eca60-414c-46e4-86dc-9c4fb755eb06 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.519888] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd98401f-8114-492c-b2d2-05d7c17847e7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.527832] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820cacea-a2d2-4140-b816-7bb3098aa1b9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 710.542214] env[59857]: DEBUG nova.compute.provider_tree [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 710.550429] env[59857]: DEBUG nova.scheduler.client.report [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 710.565583] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.320s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.566574] env[59857]: ERROR nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. 
[ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Traceback (most recent call last): [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self.driver.spawn(context, instance, image_meta, [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] vm_ref = self.build_virtual_machine(instance, [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] vif_infos = vmwarevif.get_vif_info(self._session, [ 710.566574] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] for vif in network_info: [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 710.567262] env[59857]: ERROR 
nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return self._sync_wrapper(fn, *args, **kwargs) [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self.wait() [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self[:] = self._gt.wait() [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return self._exit_event.wait() [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] result = hub.switch() [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return self.greenlet.switch() [ 710.567262] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] result = function(*args, **kwargs) [ 
710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] return func(*args, **kwargs) [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] raise e [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] nwinfo = self.network_api.allocate_for_instance( [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] created_port_ids = self._update_ports_for_instance( [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] with excutils.save_and_reraise_exception(): [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 710.567762] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] self.force_reraise() [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 
37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] raise self.value [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] updated_port = self._update_port( [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] _ensure_no_port_binding_failure(port) [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] raise exception.PortBindingFailed(port_id=port['id']) [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. [ 710.568231] env[59857]: ERROR nova.compute.manager [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] [ 710.568570] env[59857]: DEBUG nova.compute.utils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. 
{{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 710.568915] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Build of instance 37331817-f277-4f32-8d5a-11e1cf63f2b7 was re-scheduled: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 710.569333] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 710.569550] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "refresh_cache-37331817-f277-4f32-8d5a-11e1cf63f2b7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 710.569688] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquired lock "refresh_cache-37331817-f277-4f32-8d5a-11e1cf63f2b7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 710.569842] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 
37331817-f277-4f32-8d5a-11e1cf63f2b7] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 710.696116] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.249055] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Successfully created port: b475fa57-76c3-4f1c-a1bf-fd6c13cd4193 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.260401] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.268953] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Releasing lock "refresh_cache-5b59d527-232e-4ef1-bc83-4e8671607db1" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.269357] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Start destroying the instance on the 
hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 711.269547] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 711.270076] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d6983d99-ffcb-481f-82a8-49d34a031b62 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.279454] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7185a6ca-9fdc-403a-9c69-3346a57ee392 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.302878] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5b59d527-232e-4ef1-bc83-4e8671607db1 could not be found. [ 711.303129] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 711.303292] env[59857]: INFO nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Took 0.03 seconds to destroy the instance on the hypervisor. 
[ 711.303520] env[59857]: DEBUG oslo.service.loopingcall [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 711.303727] env[59857]: DEBUG nova.compute.manager [-] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 711.303866] env[59857]: DEBUG nova.network.neutron [-] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 711.418711] env[59857]: DEBUG nova.network.neutron [-] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.430294] env[59857]: DEBUG nova.network.neutron [-] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.441826] env[59857]: INFO nova.compute.manager [-] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Took 0.14 seconds to deallocate network for instance. 
[ 711.446824] env[59857]: DEBUG nova.compute.claims [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 711.447320] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.447666] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.592967] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.602564] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Releasing lock "refresh_cache-37331817-f277-4f32-8d5a-11e1cf63f2b7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.602782] env[59857]: DEBUG 
nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 711.602954] env[59857]: DEBUG nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 711.603886] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 711.693560] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb9e6312-d6b4-4896-ae4a-37ca4e83d4a9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.703582] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e80fdbd2-c400-4253-91f9-2835f6ba74bc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.738702] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.742345] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccbf916c-5d32-4f7c-a314-b2ac5ffd5f51 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.750403] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-779f0b60-a791-4ec8-b2ec-ed37efd5b550 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.755027] env[59857]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.768262] env[59857]: DEBUG nova.compute.provider_tree [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 711.769449] env[59857]: INFO nova.compute.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Took 0.17 seconds to deallocate network for instance. 
[ 711.779342] env[59857]: DEBUG nova.scheduler.client.report [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 711.792466] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.345s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.794710] env[59857]: ERROR nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information. 
[ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Traceback (most recent call last): [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self.driver.spawn(context, instance, image_meta, [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] vm_ref = self.build_virtual_machine(instance, [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] vif_infos = vmwarevif.get_vif_info(self._session, [ 711.794710] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] for vif in network_info: [ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 711.794993] env[59857]: ERROR 
nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return self._sync_wrapper(fn, *args, **kwargs)
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self.wait()
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self[:] = self._gt.wait()
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return self._exit_event.wait()
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] result = hub.switch()
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return self.greenlet.switch()
[ 711.794993] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] result = function(*args, **kwargs)
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] return func(*args, **kwargs)
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] raise e
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] nwinfo = self.network_api.allocate_for_instance(
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] created_port_ids = self._update_ports_for_instance(
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] with excutils.save_and_reraise_exception():
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 711.795372] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] self.force_reraise()
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] raise self.value
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] updated_port = self._update_port(
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] _ensure_no_port_binding_failure(port)
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] raise exception.PortBindingFailed(port_id=port['id'])
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
[ 711.795718] env[59857]: ERROR nova.compute.manager [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1]
[ 711.795718] env[59857]: DEBUG nova.compute.utils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.
{{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 711.796022] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Build of instance 5b59d527-232e-4ef1-bc83-4e8671607db1 was re-scheduled: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 711.796022] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 711.796113] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "refresh_cache-5b59d527-232e-4ef1-bc83-4e8671607db1" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.796221] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquired lock "refresh_cache-5b59d527-232e-4ef1-bc83-4e8671607db1" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.796402] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Building network info cache for instance 
{{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.859487] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Successfully created port: ef85a30e-1521-4200-a4af-6db739713904 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.903412] env[59857]: INFO nova.scheduler.client.report [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Deleted allocations for instance 37331817-f277-4f32-8d5a-11e1cf63f2b7 [ 711.908473] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "0fc5de88-13a6-498a-848c-35beb772be65" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.908687] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "0fc5de88-13a6-498a-848c-35beb772be65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.921781] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "37331817-f277-4f32-8d5a-11e1cf63f2b7" "released" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.512s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.942523] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 711.989340] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.989582] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.991033] env[59857]: INFO nova.compute.claims [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 712.114314] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance cache missing network 
info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 712.168968] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-950f8817-1bd2-49b4-b701-fc90e371e056 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.181579] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f16c418f-55f1-4e0c-9899-950223274713 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.211761] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d472d97d-efe5-4ae8-bbb3-aaa09be4e431 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.219147] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aaeb6fd-2408-4b0e-b33e-84eeffad4236 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.232165] env[59857]: DEBUG nova.compute.provider_tree [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.241338] env[59857]: DEBUG nova.scheduler.client.report [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.260020] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.260020] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 712.297320] env[59857]: DEBUG nova.compute.utils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 712.298893] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 712.298893] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 712.310016] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 712.398860] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 712.425684] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 712.426030] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 712.426159] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 712.426748] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Flavor pref 
0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 712.426748] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 712.426748] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 712.426880] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 712.426956] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 712.427133] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 712.427298] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 
tempest-ServersNegativeTestJSON-1843437481-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 712.427502] env[59857]: DEBUG nova.virt.hardware [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 712.429209] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6318e87-b2d7-40b2-bc9e-edc187471707 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.440036] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddcb07fe-4b81-4219-9e71-ab55258c702f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.635759] env[59857]: DEBUG nova.policy [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '108982ac95244915883daf6a6b4b7f35', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '257f2e74ff3341ffbc4982cb07c324fa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 712.771986] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 
tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.780911] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Releasing lock "refresh_cache-5b59d527-232e-4ef1-bc83-4e8671607db1" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.781132] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 712.781308] env[59857]: DEBUG nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 712.781465] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.084999] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] 
Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.093576] env[59857]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.102368] env[59857]: INFO nova.compute.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Took 0.32 seconds to deallocate network for instance. [ 713.209825] env[59857]: INFO nova.scheduler.client.report [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Deleted allocations for instance 5b59d527-232e-4ef1-bc83-4e8671607db1 [ 713.240587] env[59857]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "5b59d527-232e-4ef1-bc83-4e8671607db1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.641s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.486168] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "c573864b-774a-4e4d-be80-5bc9bbd1659d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.486168] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "c573864b-774a-4e4d-be80-5bc9bbd1659d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.503020] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 713.557051] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.557051] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.557461] env[59857]: INFO nova.compute.claims [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.817850] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc4d93b8-9f1b-45ab-aafb-759b77bc8d4a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.826281] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9c32301-b881-4a00-aa09-1a29cf83fed9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.865781] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9e1942-a864-4b72-9ace-725a388556ae {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.870825] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dd65512-55b7-4ecd-958d-142ab59271aa {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.887879] env[59857]: DEBUG nova.compute.provider_tree [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.898331] env[59857]: DEBUG nova.scheduler.client.report [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.928735] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.929252] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 713.974762] env[59857]: DEBUG nova.compute.utils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.975296] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 713.975790] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 713.986026] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 714.066527] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 714.091565] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.091718] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.091868] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.092106] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.092256] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.092428] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.093533] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 714.093533] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.093533] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 714.094226] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.094442] env[59857]: DEBUG nova.virt.hardware [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.095656] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bc0ad61-e14c-4a6a-a4a6-253f45bc22ce {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.107116] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c04d3bd1-a3a8-481b-adb2-d5df7bfb709b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.140923] env[59857]: WARNING oslo_vmware.rw_handles [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles response.begin() [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 
714.140923] env[59857]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 714.140923] env[59857]: ERROR oslo_vmware.rw_handles [ 714.141270] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Downloaded image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 714.142844] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Caching image {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 714.143099] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Copying Virtual Disk [datastore2] vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk to [datastore2] vmware_temp/dfbc6502-9da6-4117-97e8-4440e0c602eb/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk {{(pid=59857) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 714.143457] env[59857]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c3e69fdc-4b42-4ede-826f-a10d0b9c8d81 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.153476] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Waiting for the task: (returnval){ [ 714.153476] env[59857]: value = "task-1341428" [ 714.153476] env[59857]: _type = "Task" [ 714.153476] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 714.159099] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Task: {'id': task-1341428, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 714.449161] env[59857]: ERROR nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. 
[ 714.449161] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 714.449161] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.449161] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 714.449161] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.449161] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 714.449161] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.449161] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 714.449161] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.449161] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 714.449161] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.449161] env[59857]: ERROR nova.compute.manager raise self.value [ 714.449161] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.449161] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 714.449161] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.449161] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 714.449600] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.449600] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 714.449600] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. [ 714.449600] env[59857]: ERROR nova.compute.manager [ 714.449600] env[59857]: Traceback (most recent call last): [ 714.449600] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 714.449600] env[59857]: listener.cb(fileno) [ 714.449600] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.449600] env[59857]: result = function(*args, **kwargs) [ 714.449600] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.449600] env[59857]: return func(*args, **kwargs) [ 714.449600] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.449600] env[59857]: raise e [ 714.449600] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.449600] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 714.449600] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.449600] env[59857]: created_port_ids = self._update_ports_for_instance( [ 714.449600] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.449600] env[59857]: with excutils.save_and_reraise_exception(): [ 714.449600] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.449600] env[59857]: self.force_reraise() [ 714.449600] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.449600] env[59857]: raise self.value [ 714.449600] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.449600] env[59857]: updated_port = self._update_port( [ 714.449600] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.449600] env[59857]: _ensure_no_port_binding_failure(port) [ 714.449600] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.449600] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 714.450348] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. [ 714.450348] env[59857]: Removing descriptor: 20 [ 714.450348] env[59857]: ERROR nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Traceback (most recent call last): [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] yield resources [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] self.driver.spawn(context, instance, image_meta, [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.450348] env[59857]: ERROR nova.compute.manager 
[instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 714.450348] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] vm_ref = self.build_virtual_machine(instance, [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] vif_infos = vmwarevif.get_vif_info(self._session, [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] for vif in network_info: [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] return self._sync_wrapper(fn, *args, **kwargs) [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] self.wait() [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] self[:] = self._gt.wait() [ 714.450666] 
env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] return self._exit_event.wait() [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 714.450666] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] result = hub.switch() [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] return self.greenlet.switch() [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] result = function(*args, **kwargs) [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] return func(*args, **kwargs) [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] raise e [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] nwinfo = self.network_api.allocate_for_instance( [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] created_port_ids = self._update_ports_for_instance( [ 714.451047] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] with excutils.save_and_reraise_exception(): [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] self.force_reraise() [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] raise self.value [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] updated_port = self._update_port( [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] _ensure_no_port_binding_failure(port) [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] raise exception.PortBindingFailed(port_id=port['id']) [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. [ 714.451447] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] [ 714.451789] env[59857]: INFO nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Terminating instance [ 714.451789] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "refresh_cache-29c85bbf-553e-4b82-ad7c-5341ffc5af63" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.451789] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquired lock "refresh_cache-29c85bbf-553e-4b82-ad7c-5341ffc5af63" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.451789] env[59857]: DEBUG nova.network.neutron [None 
req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 714.514212] env[59857]: DEBUG nova.policy [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f2d798fa67b4212ba9b0cba90b00820', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6f4d626262ba4bdf905536d0a5919b61', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.527104] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 714.662014] env[59857]: DEBUG oslo_vmware.exceptions [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Fault InvalidArgument not matched. 
{{(pid=59857) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 714.662318] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 714.662857] env[59857]: ERROR nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 714.662857] env[59857]: Faults: ['InvalidArgument'] [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Traceback (most recent call last): [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] yield resources [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] self.driver.spawn(context, instance, image_meta, [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] 
self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] self._fetch_image_if_missing(context, vi) [ 714.662857] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] image_cache(vi, tmp_image_ds_loc) [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] vm_util.copy_virtual_disk( [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] session._wait_for_task(vmdk_copy_task) [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] return self.wait_for_task(task_ref) [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] return evt.wait() [ 
714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] result = hub.switch() [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.663355] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] return self.greenlet.switch() [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] self.f(*self.args, **self.kw) [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] raise exceptions.translate_fault(task_info.error) [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Faults: ['InvalidArgument'] [ 714.663842] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] [ 714.663842] env[59857]: INFO nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Terminating instance [ 
714.664781] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.664978] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 714.665481] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "refresh_cache-b7ab8792-137d-4053-9df9-3d560aa5e411" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.665626] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquired lock "refresh_cache-b7ab8792-137d-4053-9df9-3d560aa5e411" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.665784] env[59857]: DEBUG nova.network.neutron [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 714.666708] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-13e36754-c530-4176-a31e-f250381aea2b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.680434] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 714.680618] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59857) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 714.681719] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-61a0ee3a-5bde-40a6-b686-386dbd7b6e21 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.693876] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Waiting for the task: (returnval){ [ 714.693876] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5207c7d1-d3c8-c415-805e-78af5616c46e" [ 714.693876] env[59857]: _type = "Task" [ 714.693876] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 714.706927] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5207c7d1-d3c8-c415-805e-78af5616c46e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 714.778065] env[59857]: DEBUG nova.network.neutron [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.012168] env[59857]: ERROR nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. [ 715.012168] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 715.012168] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.012168] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 715.012168] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.012168] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 715.012168] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.012168] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 715.012168] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.012168] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 715.012168] env[59857]: ERROR nova.compute.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.012168] env[59857]: ERROR nova.compute.manager raise self.value [ 715.012168] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.012168] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 715.012168] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.012168] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 715.012819] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.012819] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 715.012819] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. 
[ 715.012819] env[59857]: ERROR nova.compute.manager [ 715.012819] env[59857]: Traceback (most recent call last): [ 715.012819] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 715.012819] env[59857]: listener.cb(fileno) [ 715.012819] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.012819] env[59857]: result = function(*args, **kwargs) [ 715.012819] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.012819] env[59857]: return func(*args, **kwargs) [ 715.012819] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 715.012819] env[59857]: raise e [ 715.012819] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.012819] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 715.012819] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.012819] env[59857]: created_port_ids = self._update_ports_for_instance( [ 715.012819] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.012819] env[59857]: with excutils.save_and_reraise_exception(): [ 715.012819] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.012819] env[59857]: self.force_reraise() [ 715.012819] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.012819] env[59857]: raise self.value [ 715.012819] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.012819] env[59857]: updated_port = self._update_port( [ 715.012819] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.012819] env[59857]: _ensure_no_port_binding_failure(port) [ 715.012819] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.012819] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 715.014139] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. [ 715.014139] env[59857]: Removing descriptor: 12 [ 715.014139] env[59857]: ERROR nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Traceback (most recent call last): [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] yield resources [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self.driver.spawn(context, instance, image_meta, [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 715.014139] env[59857]: ERROR nova.compute.manager 
[instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 715.014139] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] vm_ref = self.build_virtual_machine(instance, [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] vif_infos = vmwarevif.get_vif_info(self._session, [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] for vif in network_info: [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return self._sync_wrapper(fn, *args, **kwargs) [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self.wait() [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self[:] = self._gt.wait() [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 
181, in wait [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return self._exit_event.wait() [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 715.014613] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] result = hub.switch() [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return self.greenlet.switch() [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] result = function(*args, **kwargs) [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return func(*args, **kwargs) [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] raise e [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] 
nwinfo = self.network_api.allocate_for_instance( [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] created_port_ids = self._update_ports_for_instance( [ 715.014996] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] with excutils.save_and_reraise_exception(): [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self.force_reraise() [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] raise self.value [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] updated_port = self._update_port( [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] 
_ensure_no_port_binding_failure(port) [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] raise exception.PortBindingFailed(port_id=port['id']) [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. [ 715.015370] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] [ 715.015716] env[59857]: INFO nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Terminating instance [ 715.016597] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.016741] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.016900] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Building 
network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 715.119675] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.130591] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Releasing lock "refresh_cache-29c85bbf-553e-4b82-ad7c-5341ffc5af63" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.131024] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 715.131231] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 715.131752] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-094ded4f-278e-40d5-b81b-3d63e96964ef {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.137587] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.144511] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c51176e-c03d-43e4-a7d1-cb4b3da576fc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.174513] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63 could not be found. 
[ 715.175212] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 715.178277] env[59857]: INFO nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Took 0.04 seconds to destroy the instance on the hypervisor. [ 715.178857] env[59857]: DEBUG oslo.service.loopingcall [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 715.179120] env[59857]: DEBUG nova.compute.manager [-] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 715.179223] env[59857]: DEBUG nova.network.neutron [-] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 715.204739] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Preparing fetch location {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 715.204922] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf 
tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Creating directory with path [datastore2] vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 715.205202] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f2324873-5427-4e3a-ab31-d8da2d2213d8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.230101] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Created directory with path [datastore2] vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 715.230101] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Fetch image to [datastore2] vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 715.230101] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to [datastore2] vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 715.230101] env[59857]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77333468-2798-4dc9-96dd-2931fc02abd0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.237557] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb01a496-ba0b-4fd8-a8dd-2fa1c94d9c25 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.240999] env[59857]: DEBUG nova.network.neutron [-] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.249981] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f60881d1-5f11-4cd1-bb4b-c487498dffb0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.254784] env[59857]: DEBUG nova.network.neutron [-] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.289639] env[59857]: INFO nova.compute.manager [-] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Took 0.11 seconds to deallocate network for instance. 
[ 715.290449] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b316dec-07bf-474b-960b-62e48680c3ad {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.295330] env[59857]: DEBUG nova.compute.claims [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 715.295629] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.295870] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.301505] env[59857]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-af500c9b-4b44-42d4-b870-37fcd2c72c6c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.326328] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Downloading image file data 
4a4a4830-1ff7-4cff-ab75-d665942f46b5 to the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 715.394456] env[59857]: DEBUG oslo_vmware.rw_handles [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59857) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 715.457943] env[59857]: DEBUG oslo_vmware.rw_handles [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Completed reading data from the image iterator. {{(pid=59857) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 715.458054] env[59857]: DEBUG oslo_vmware.rw_handles [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59857) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 715.461497] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Successfully created port: 78685b50-e23c-45ff-8b69-8ac77b285c14 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 715.539166] env[59857]: DEBUG nova.network.neutron [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.551088] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Releasing lock "refresh_cache-b7ab8792-137d-4053-9df9-3d560aa5e411" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.551403] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 715.551583] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 715.552748] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c40d6acf-235a-4b9f-80de-4ff67fc72c49 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.562229] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Unregistering the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 715.564160] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-553d19bf-2f5a-47a4-bb73-576612541fb9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.579016] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a684eb88-4aa0-48c0-8e44-0d028ab6845f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.587548] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35b236b0-0e84-4d62-b74b-627bde7dee9b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.592930] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 
tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Unregistered the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 715.592930] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Deleting contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 715.592930] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Deleting the datastore file [datastore2] b7ab8792-137d-4053-9df9-3d560aa5e411 {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 715.593261] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9de66757-b8ac-4642-b753-baceb6491a04 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.623720] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89073bc1-777a-4e90-9e5e-e8d8706d22ae {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.628320] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Waiting for the task: (returnval){
[ 715.628320] env[59857]: value = "task-1341430"
[ 715.628320] env[59857]: _type = "Task"
[ 715.628320] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 715.637955] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e222cf8-2244-49d5-ae2a-a3a6705eaffd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.644047] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Task: {'id': task-1341430, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 715.653828] env[59857]: DEBUG nova.compute.provider_tree [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 715.667134] env[59857]: DEBUG nova.scheduler.client.report [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 715.685876] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.389s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 715.687137] env[59857]: ERROR nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information.
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Traceback (most recent call last):
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     self.driver.spawn(context, instance, image_meta,
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     vm_ref = self.build_virtual_machine(instance,
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 715.687137] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     for vif in network_info:
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     return self._sync_wrapper(fn, *args, **kwargs)
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     self.wait()
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     self[:] = self._gt.wait()
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     return self._exit_event.wait()
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     result = hub.switch()
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     return self.greenlet.switch()
[ 715.687569] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     result = function(*args, **kwargs)
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     return func(*args, **kwargs)
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     raise e
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     nwinfo = self.network_api.allocate_for_instance(
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     created_port_ids = self._update_ports_for_instance(
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     with excutils.save_and_reraise_exception():
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 715.690953] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     self.force_reraise()
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     raise self.value
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     updated_port = self._update_port(
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     _ensure_no_port_binding_failure(port)
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]     raise exception.PortBindingFailed(port_id=port['id'])
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information.
[ 715.692745] env[59857]: ERROR nova.compute.manager [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63]
[ 715.692745] env[59857]: DEBUG nova.compute.utils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 715.693679] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Build of instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63 was re-scheduled: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 715.693679] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 715.693679] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "refresh_cache-29c85bbf-553e-4b82-ad7c-5341ffc5af63" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 715.693679] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquired lock "refresh_cache-29c85bbf-553e-4b82-ad7c-5341ffc5af63" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 715.694331] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 715.803699] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 716.074548] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 716.084287] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 716.084698] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 716.084882] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 716.085400] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fbc852f2-0842-4a8a-ad49-bdae3ef27ce5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.095362] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ef0fcac-a286-460f-8fd2-a565c4cf4ae6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.119791] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8 could not be found.
[ 716.120047] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 716.120230] env[59857]: INFO nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 716.120479] env[59857]: DEBUG oslo.service.loopingcall [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 716.120689] env[59857]: DEBUG nova.compute.manager [-] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 716.120786] env[59857]: DEBUG nova.network.neutron [-] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 716.141849] env[59857]: DEBUG oslo_vmware.api [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Task: {'id': task-1341430, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.036155} completed successfully. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 716.142131] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Deleted the datastore file {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 716.142316] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Deleted contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 716.142486] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 716.142653] env[59857]: INFO nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Took 0.59 seconds to destroy the instance on the hypervisor.
[ 716.142876] env[59857]: DEBUG oslo.service.loopingcall [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 716.143078] env[59857]: DEBUG nova.compute.manager [-] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Skipping network deallocation for instance since networking was not requested. {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 716.149508] env[59857]: DEBUG nova.compute.claims [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 716.149668] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 716.149870] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 716.224647] env[59857]: DEBUG nova.network.neutron [-] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 716.237206] env[59857]: DEBUG nova.network.neutron [-] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 716.250772] env[59857]: INFO nova.compute.manager [-] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Took 0.13 seconds to deallocate network for instance.
[ 716.253009] env[59857]: DEBUG nova.compute.claims [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 716.253213] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 716.333290] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6016e6b-1221-47ce-a154-f568d07b495a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.341952] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c3ac931-3933-4a9c-8c87-d964fa6deb13 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.349065] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "72b94cae-d12d-4228-8ca0-20fde3095c38" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 716.349290] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "72b94cae-d12d-4228-8ca0-20fde3095c38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 716.376428] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd849efe-f976-4e22-bfe4-4fd313636b6b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.383756] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6215da0c-c361-42e8-8808-52dd791b8b65 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.397038] env[59857]: DEBUG nova.compute.provider_tree [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 716.410159] env[59857]: DEBUG nova.scheduler.client.report [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 716.428265] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 716.428800] env[59857]: ERROR nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 716.428800] env[59857]: Faults: ['InvalidArgument']
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Traceback (most recent call last):
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     self.driver.spawn(context, instance, image_meta,
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     self._fetch_image_if_missing(context, vi)
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     image_cache(vi, tmp_image_ds_loc)
[ 716.428800] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     vm_util.copy_virtual_disk(
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     session._wait_for_task(vmdk_copy_task)
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     return self.wait_for_task(task_ref)
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     return evt.wait()
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     result = hub.switch()
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     return self.greenlet.switch()
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 716.429192] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     self.f(*self.args, **self.kw)
[ 716.429526] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 716.429526] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]     raise exceptions.translate_fault(task_info.error)
[ 716.429526] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 716.429526] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Faults: ['InvalidArgument']
[ 716.429526] env[59857]: ERROR nova.compute.manager [instance: b7ab8792-137d-4053-9df9-3d560aa5e411]
[ 716.429526] env[59857]: DEBUG nova.compute.utils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] VimFaultException {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 716.430531] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.177s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 716.433695] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Build of instance b7ab8792-137d-4053-9df9-3d560aa5e411 was re-scheduled: A specified parameter was not correct: fileType
[ 716.433695] env[59857]: Faults: ['InvalidArgument'] {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 716.434083] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 716.434325] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "refresh_cache-b7ab8792-137d-4053-9df9-3d560aa5e411" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 716.434467] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquired lock "refresh_cache-b7ab8792-137d-4053-9df9-3d560aa5e411" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 716.434623] env[59857]: DEBUG nova.network.neutron [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 716.562059] env[59857]: DEBUG nova.network.neutron [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 716.672508] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecb3bc95-27b2-4d3a-8758-b8dcd440d8e4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.680591] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc777656-a368-4177-bb28-2f0c7505fca0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.726351] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 716.727566] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d928afc3-da32-4e65-8845-10ca031dfd54 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.736617] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8182cb95-a41c-4290-bcdd-533cac7a9a23 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 716.741596] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Releasing lock "refresh_cache-29c85bbf-553e-4b82-ad7c-5341ffc5af63" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 716.744017] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 716.744017] env[59857]: DEBUG nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 716.744017] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 716.756131] env[59857]: DEBUG nova.compute.provider_tree [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 716.768681] env[59857]: DEBUG nova.scheduler.client.report [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 716.784580] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.354s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 716.785193] env[59857]: ERROR nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information.
[ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Traceback (most recent call last): [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self.driver.spawn(context, instance, image_meta, [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] vm_ref = self.build_virtual_machine(instance, [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] vif_infos = vmwarevif.get_vif_info(self._session, [ 716.785193] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] for vif in network_info: [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 716.785523] env[59857]: ERROR 
nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return self._sync_wrapper(fn, *args, **kwargs) [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self.wait() [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self[:] = self._gt.wait() [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return self._exit_event.wait() [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] result = hub.switch() [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return self.greenlet.switch() [ 716.785523] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] result = function(*args, **kwargs) [ 
716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] return func(*args, **kwargs) [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] raise e [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] nwinfo = self.network_api.allocate_for_instance( [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] created_port_ids = self._update_ports_for_instance( [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] with excutils.save_and_reraise_exception(): [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.785866] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] self.force_reraise() [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 
5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] raise self.value [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] updated_port = self._update_port( [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] _ensure_no_port_binding_failure(port) [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] raise exception.PortBindingFailed(port_id=port['id']) [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. [ 716.786184] env[59857]: ERROR nova.compute.manager [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] [ 716.786184] env[59857]: DEBUG nova.compute.utils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. 
{{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 716.787329] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Build of instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8 was re-scheduled: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 716.787727] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 716.787944] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.788096] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.788248] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Building network info cache for instance 
{{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.837056] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.846120] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.856212] env[59857]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.872119] env[59857]: INFO nova.compute.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Took 0.13 seconds to deallocate network for instance. 
[ 716.977913] env[59857]: INFO nova.scheduler.client.report [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Deleted allocations for instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63 [ 716.996144] env[59857]: DEBUG nova.network.neutron [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.997935] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "29c85bbf-553e-4b82-ad7c-5341ffc5af63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.972s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.010851] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 717.035226] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Releasing lock "refresh_cache-b7ab8792-137d-4053-9df9-3d560aa5e411" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.035452] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 717.035632] env[59857]: DEBUG nova.compute.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 717.080365] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.080628] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.082124] env[59857]: INFO nova.compute.claims [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 717.162798] env[59857]: INFO nova.scheduler.client.report [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Deleted allocations for instance b7ab8792-137d-4053-9df9-3d560aa5e411 [ 717.192870] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b7ab8792-137d-4053-9df9-3d560aa5e411" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 98.408s {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.309512] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c7c3690-ce80-4ffe-95d8-e3e5dc53514e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.318212] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c445466-5026-4fea-9eb1-6acbb8b0348b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.350864] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e39b6b8-abc6-4b21-99f3-94657f2c9705 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.354387] env[59857]: ERROR nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. 
[ 717.354387] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 717.354387] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 717.354387] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 717.354387] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.354387] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 717.354387] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.354387] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 717.354387] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.354387] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 717.354387] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.354387] env[59857]: ERROR nova.compute.manager raise self.value [ 717.354387] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.354387] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 717.354387] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.354387] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 717.354813] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.354813] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 717.354813] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. [ 717.354813] env[59857]: ERROR nova.compute.manager [ 717.354813] env[59857]: Traceback (most recent call last): [ 717.354813] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 717.354813] env[59857]: listener.cb(fileno) [ 717.354813] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 717.354813] env[59857]: result = function(*args, **kwargs) [ 717.354813] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 717.354813] env[59857]: return func(*args, **kwargs) [ 717.354813] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 717.354813] env[59857]: raise e [ 717.354813] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 717.354813] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 717.354813] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.354813] env[59857]: created_port_ids = self._update_ports_for_instance( [ 717.354813] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.354813] env[59857]: with excutils.save_and_reraise_exception(): [ 717.354813] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.354813] env[59857]: self.force_reraise() [ 717.354813] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.354813] env[59857]: raise self.value [ 717.354813] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.354813] env[59857]: updated_port = self._update_port( [ 717.354813] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.354813] env[59857]: _ensure_no_port_binding_failure(port) [ 717.354813] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.354813] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 717.355521] env[59857]: nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. [ 717.355521] env[59857]: Removing descriptor: 17 [ 717.355521] env[59857]: ERROR nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. 
[ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Traceback (most recent call last): [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] yield resources [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self.driver.spawn(context, instance, image_meta, [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.355521] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] vm_ref = self.build_virtual_machine(instance, [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.355834] env[59857]: ERROR 
nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] for vif in network_info: [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return self._sync_wrapper(fn, *args, **kwargs) [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self.wait() [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self[:] = self._gt.wait() [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return self._exit_event.wait() [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 717.355834] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] result = hub.switch() [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return self.greenlet.switch() [ 717.356162] env[59857]: ERROR 
nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] result = function(*args, **kwargs) [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return func(*args, **kwargs) [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] raise e [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] nwinfo = self.network_api.allocate_for_instance( [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] created_port_ids = self._update_ports_for_instance( [ 717.356162] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] with excutils.save_and_reraise_exception(): [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 
6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self.force_reraise() [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] raise self.value [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] updated_port = self._update_port( [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] _ensure_no_port_binding_failure(port) [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] raise exception.PortBindingFailed(port_id=port['id']) [ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. 
[ 717.356481] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] [ 717.356778] env[59857]: INFO nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Terminating instance [ 717.357809] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "refresh_cache-6576c530-0b88-453e-bace-70a4f1c76d3c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 717.357955] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquired lock "refresh_cache-6576c530-0b88-453e-bace-70a4f1c76d3c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 717.358132] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 717.362139] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e031776-c87f-42ba-a665-ef795a322095 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.378397] env[59857]: DEBUG nova.compute.provider_tree [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 
tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.380045] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Successfully created port: 2cc98968-ca9f-4493-94eb-91a79482e1f8 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 717.390042] env[59857]: DEBUG nova.scheduler.client.report [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.393494] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.404166] env[59857]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 
tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.404519] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 717.404705] env[59857]: DEBUG nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 717.404866] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.407465] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.407919] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 
tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 717.425582] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.453429] env[59857]: DEBUG nova.compute.utils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 717.454802] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 717.454968] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 717.463893] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 717.482026] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.490683] env[59857]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.501111] env[59857]: INFO nova.compute.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Took 0.10 seconds to deallocate network for instance. 
[ 717.526404] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 717.556425] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 717.556665] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 717.556812] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 
tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 717.556981] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 717.557132] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 717.557278] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 717.557497] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 717.557650] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 717.557815] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 717.557969] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 717.558147] env[59857]: DEBUG nova.virt.hardware [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 717.559199] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76d75c8a-4dd4-4497-bafa-d4694370c3a1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.567475] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f5cf7c9-2c22-47b3-918e-fc3db3b226c1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.600447] env[59857]: INFO nova.scheduler.client.report [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Deleted allocations for instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8 [ 717.616167] env[59857]: DEBUG oslo_concurrency.lockutils 
[None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "5ad1fabc-bae4-47cb-9b27-42c86c4b02e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.192s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.017817] env[59857]: DEBUG nova.policy [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '816a348b026b4041b8e75233830d4736', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b868df8009cd4d07b50856808eceb007', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 718.145528] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.155351] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Releasing lock "refresh_cache-6576c530-0b88-453e-bace-70a4f1c76d3c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 718.155993] env[59857]: 
DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 718.156199] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 718.156778] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fb6a7ef8-8d82-432b-b662-11fae445af4b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.173763] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c3286d9-8941-44a8-8582-d581c47ae742 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.198828] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6576c530-0b88-453e-bace-70a4f1c76d3c could not be found. 
[ 718.199644] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 718.201624] env[59857]: INFO nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 718.201933] env[59857]: DEBUG oslo.service.loopingcall [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 718.202211] env[59857]: DEBUG nova.compute.manager [-] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 718.202346] env[59857]: DEBUG nova.network.neutron [-] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 718.287091] env[59857]: DEBUG nova.network.neutron [-] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 718.294960] env[59857]: DEBUG nova.network.neutron [-] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.303541] env[59857]: INFO nova.compute.manager [-] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Took 0.10 seconds to deallocate network for instance. [ 718.308399] env[59857]: DEBUG nova.compute.claims [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 718.308399] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.308399] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.464989] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-581ab1f7-7b50-4529-86d2-f4e9ebd8a7f2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.473306] env[59857]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9558d97e-8252-44ce-a63d-a4043117c445 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.508452] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afee62da-21b5-47fb-a99e-4828d6d4589d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.516187] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d24ab5f-afce-4867-b1b5-e3bf4fa33751 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.530045] env[59857]: DEBUG nova.compute.provider_tree [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.541019] env[59857]: DEBUG nova.scheduler.client.report [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.555166] env[59857]: DEBUG oslo_concurrency.lockutils [None 
req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.248s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.555970] env[59857]: ERROR nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Traceback (most recent call last): [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self.driver.spawn(context, instance, image_meta, [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] vm_ref = self.build_virtual_machine(instance, [ 718.555970] env[59857]: ERROR 
nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] vif_infos = vmwarevif.get_vif_info(self._session, [ 718.555970] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] for vif in network_info: [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return self._sync_wrapper(fn, *args, **kwargs) [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self.wait() [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self[:] = self._gt.wait() [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return self._exit_event.wait() [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] result = hub.switch() [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return self.greenlet.switch() [ 718.556380] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] result = function(*args, **kwargs) [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] return func(*args, **kwargs) [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] raise e [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] nwinfo = self.network_api.allocate_for_instance( [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.556769] 
env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] created_port_ids = self._update_ports_for_instance( [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] with excutils.save_and_reraise_exception(): [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.556769] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] self.force_reraise() [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] raise self.value [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] updated_port = self._update_port( [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] _ensure_no_port_binding_failure(port) [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.557134] env[59857]: ERROR 
nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] raise exception.PortBindingFailed(port_id=port['id']) [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. [ 718.557134] env[59857]: ERROR nova.compute.manager [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] [ 718.558201] env[59857]: DEBUG nova.compute.utils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 718.559741] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Build of instance 6576c530-0b88-453e-bace-70a4f1c76d3c was re-scheduled: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 718.560319] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 718.560646] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "refresh_cache-6576c530-0b88-453e-bace-70a4f1c76d3c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.560907] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquired lock "refresh_cache-6576c530-0b88-453e-bace-70a4f1c76d3c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.561191] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 718.833682] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.343257] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.363429] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Releasing lock "refresh_cache-6576c530-0b88-453e-bace-70a4f1c76d3c" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.363429] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 719.363429] env[59857]: DEBUG nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 719.363429] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 719.412852] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.426709] env[59857]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.441554] env[59857]: INFO nova.compute.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Took 0.08 seconds to deallocate network for instance. 
[ 719.456237] env[59857]: ERROR nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. [ 719.456237] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 719.456237] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 719.456237] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 719.456237] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 719.456237] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 719.456237] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 719.456237] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 719.456237] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.456237] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 719.456237] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.456237] env[59857]: ERROR nova.compute.manager raise self.value [ 719.456237] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.456237] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 719.456237] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 719.456237] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 719.456651] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.456651] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 719.456651] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. [ 719.456651] env[59857]: ERROR nova.compute.manager [ 719.456651] env[59857]: Traceback (most recent call last): [ 719.456651] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 719.456651] env[59857]: listener.cb(fileno) [ 719.456651] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 719.456651] env[59857]: result = function(*args, **kwargs) [ 719.456651] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 719.456651] env[59857]: return func(*args, **kwargs) [ 719.456651] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 719.456651] env[59857]: raise e [ 719.456651] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 719.456651] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 719.456651] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 719.456651] env[59857]: created_port_ids = self._update_ports_for_instance( [ 719.456651] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 719.456651] env[59857]: with excutils.save_and_reraise_exception(): [ 719.456651] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.456651] 
env[59857]: self.force_reraise() [ 719.456651] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.456651] env[59857]: raise self.value [ 719.456651] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.456651] env[59857]: updated_port = self._update_port( [ 719.456651] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.456651] env[59857]: _ensure_no_port_binding_failure(port) [ 719.456651] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.456651] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 719.457334] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. [ 719.457334] env[59857]: Removing descriptor: 19 [ 719.457334] env[59857]: ERROR nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. 
[ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Traceback (most recent call last): [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] yield resources [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self.driver.spawn(context, instance, image_meta, [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 719.457334] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] vm_ref = self.build_virtual_machine(instance, [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] vif_infos = vmwarevif.get_vif_info(self._session, [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 719.457606] env[59857]: ERROR 
nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] for vif in network_info: [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return self._sync_wrapper(fn, *args, **kwargs) [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self.wait() [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self[:] = self._gt.wait() [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return self._exit_event.wait() [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 719.457606] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] result = hub.switch() [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return self.greenlet.switch() [ 719.457937] env[59857]: ERROR 
nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] result = function(*args, **kwargs) [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return func(*args, **kwargs) [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] raise e [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] nwinfo = self.network_api.allocate_for_instance( [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] created_port_ids = self._update_ports_for_instance( [ 719.457937] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] with excutils.save_and_reraise_exception(): [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: 
db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self.force_reraise() [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] raise self.value [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] updated_port = self._update_port( [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] _ensure_no_port_binding_failure(port) [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] raise exception.PortBindingFailed(port_id=port['id']) [ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. 
[ 719.458579] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] [ 719.458889] env[59857]: INFO nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Terminating instance [ 719.461736] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "refresh_cache-db69e0d0-724b-4a87-80f5-390cfc395ee9" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.461736] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquired lock "refresh_cache-db69e0d0-724b-4a87-80f5-390cfc395ee9" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.461926] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 719.548904] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.611519] env[59857]: INFO nova.scheduler.client.report [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Deleted allocations for instance 6576c530-0b88-453e-bace-70a4f1c76d3c [ 719.634156] env[59857]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "6576c530-0b88-453e-bace-70a4f1c76d3c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.137s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.333702] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.344701] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Successfully created port: cea10054-d802-4762-920a-926732dcdf98 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 720.358280] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Releasing lock "refresh_cache-db69e0d0-724b-4a87-80f5-390cfc395ee9" {{(pid=59857) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 720.358745] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 720.358952] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 720.359493] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fa3ccf42-045c-4b4e-9e70-a2f68c7dbe39 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.373819] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e344f4f-8de0-465c-bf7a-367acbd8c8da {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.401026] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance db69e0d0-724b-4a87-80f5-390cfc395ee9 could not be found. 
[ 720.401026] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 720.401026] env[59857]: INFO nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 720.401026] env[59857]: DEBUG oslo.service.loopingcall [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 720.401026] env[59857]: DEBUG nova.compute.manager [-] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 720.401597] env[59857]: DEBUG nova.network.neutron [-] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 720.476835] env[59857]: DEBUG nova.network.neutron [-] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 720.483050] env[59857]: DEBUG nova.network.neutron [-] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.499599] env[59857]: INFO nova.compute.manager [-] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Took 0.10 seconds to deallocate network for instance. [ 720.503185] env[59857]: DEBUG nova.compute.claims [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 720.503185] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.503185] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.681676] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30eb1223-3769-453b-b4de-ff67f96df3e5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.693034] env[59857]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef5d3c1e-4188-4452-a2dc-d1d9397bb707 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.727121] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6be844c-8bf3-4e2f-9eb6-c7c94bc2b71a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.736984] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a1cf92c-04a9-4d3d-bc7a-99e9353df42b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.750877] env[59857]: DEBUG nova.compute.provider_tree [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.761842] env[59857]: DEBUG nova.scheduler.client.report [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.783127] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a 
tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.280s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.783673] env[59857]: ERROR nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Traceback (most recent call last): [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self.driver.spawn(context, instance, image_meta, [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] vm_ref = self.build_virtual_machine(instance, [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] vif_infos = vmwarevif.get_vif_info(self._session, [ 720.783673] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] for vif in network_info: [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return self._sync_wrapper(fn, *args, **kwargs) [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self.wait() [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self[:] = self._gt.wait() [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return self._exit_event.wait() [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 720.784179] env[59857]: ERROR 
nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] result = hub.switch() [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return self.greenlet.switch() [ 720.784179] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] result = function(*args, **kwargs) [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] return func(*args, **kwargs) [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] raise e [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] nwinfo = self.network_api.allocate_for_instance( [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] created_port_ids = 
self._update_ports_for_instance( [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] with excutils.save_and_reraise_exception(): [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 720.784562] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] self.force_reraise() [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] raise self.value [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] updated_port = self._update_port( [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] _ensure_no_port_binding_failure(port) [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] raise 
exception.PortBindingFailed(port_id=port['id']) [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. [ 720.785762] env[59857]: ERROR nova.compute.manager [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] [ 720.785762] env[59857]: DEBUG nova.compute.utils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 720.786890] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Build of instance db69e0d0-724b-4a87-80f5-390cfc395ee9 was re-scheduled: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 720.787563] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 720.787910] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "refresh_cache-db69e0d0-724b-4a87-80f5-390cfc395ee9" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 720.788177] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquired lock "refresh_cache-db69e0d0-724b-4a87-80f5-390cfc395ee9" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 720.788701] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 720.875077] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.336929] env[59857]: ERROR nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. [ 721.336929] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 721.336929] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 721.336929] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 721.336929] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 721.336929] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 721.336929] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 721.336929] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 721.336929] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 721.336929] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 721.336929] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 721.336929] env[59857]: ERROR nova.compute.manager raise self.value [ 721.336929] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 721.336929] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 721.336929] env[59857]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 721.336929] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 721.337872] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 721.337872] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 721.337872] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. [ 721.337872] env[59857]: ERROR nova.compute.manager [ 721.337872] env[59857]: Traceback (most recent call last): [ 721.337872] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 721.337872] env[59857]: listener.cb(fileno) [ 721.337872] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 721.337872] env[59857]: result = function(*args, **kwargs) [ 721.337872] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 721.337872] env[59857]: return func(*args, **kwargs) [ 721.337872] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 721.337872] env[59857]: raise e [ 721.337872] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 721.337872] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 721.337872] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 721.337872] env[59857]: created_port_ids = self._update_ports_for_instance( [ 721.337872] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 721.337872] env[59857]: with excutils.save_and_reraise_exception(): [ 721.337872] env[59857]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 721.337872] env[59857]: self.force_reraise() [ 721.337872] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 721.337872] env[59857]: raise self.value [ 721.337872] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 721.337872] env[59857]: updated_port = self._update_port( [ 721.337872] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 721.337872] env[59857]: _ensure_no_port_binding_failure(port) [ 721.337872] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 721.337872] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 721.339407] env[59857]: nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. [ 721.339407] env[59857]: Removing descriptor: 21 [ 721.339407] env[59857]: ERROR nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. 
[ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Traceback (most recent call last): [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] yield resources [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self.driver.spawn(context, instance, image_meta, [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 721.339407] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] vm_ref = self.build_virtual_machine(instance, [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] vif_infos = vmwarevif.get_vif_info(self._session, [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 721.340617] env[59857]: ERROR 
nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] for vif in network_info: [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return self._sync_wrapper(fn, *args, **kwargs) [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self.wait() [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self[:] = self._gt.wait() [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return self._exit_event.wait() [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 721.340617] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] result = hub.switch() [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return self.greenlet.switch() [ 721.341947] env[59857]: ERROR 
nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] result = function(*args, **kwargs) [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return func(*args, **kwargs) [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] raise e [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] nwinfo = self.network_api.allocate_for_instance( [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] created_port_ids = self._update_ports_for_instance( [ 721.341947] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] with excutils.save_and_reraise_exception(): [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 
40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self.force_reraise() [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] raise self.value [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] updated_port = self._update_port( [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] _ensure_no_port_binding_failure(port) [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] raise exception.PortBindingFailed(port_id=port['id']) [ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. 
[ 721.342273] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] [ 721.342569] env[59857]: INFO nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Terminating instance [ 721.342569] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "refresh_cache-40115f76-28d8-4f39-9dca-59401f52f22f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 721.342569] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquired lock "refresh_cache-40115f76-28d8-4f39-9dca-59401f52f22f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 721.342569] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 721.474841] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.857259] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.868371] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Releasing lock "refresh_cache-db69e0d0-724b-4a87-80f5-390cfc395ee9" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.868592] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 721.868768] env[59857]: DEBUG nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 721.868935] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 721.880264] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.889654] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Releasing lock "refresh_cache-40115f76-28d8-4f39-9dca-59401f52f22f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.890519] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 721.890519] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 721.890722] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3d8ab132-f919-403e-b855-e32f99b8c208 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.901422] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-605ae321-cdfe-4aaa-a1a8-2191e202508a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.932907] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 40115f76-28d8-4f39-9dca-59401f52f22f could not be found. 
[ 721.933149] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 721.933324] env[59857]: INFO nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 721.933558] env[59857]: DEBUG oslo.service.loopingcall [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 721.933753] env[59857]: DEBUG nova.compute.manager [-] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 721.933865] env[59857]: DEBUG nova.network.neutron [-] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 721.954756] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.964250] env[59857]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.976333] env[59857]: INFO nova.compute.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Took 0.11 seconds to deallocate network for instance. [ 722.003740] env[59857]: DEBUG nova.network.neutron [-] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.015630] env[59857]: DEBUG nova.network.neutron [-] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.024770] env[59857]: INFO nova.compute.manager [-] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Took 0.09 seconds to deallocate network for instance. 
[ 722.026725] env[59857]: DEBUG nova.compute.claims [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 722.027068] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.027362] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.084033] env[59857]: INFO nova.scheduler.client.report [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Deleted allocations for instance db69e0d0-724b-4a87-80f5-390cfc395ee9 [ 722.106041] env[59857]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "db69e0d0-724b-4a87-80f5-390cfc395ee9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.457s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.190178] env[59857]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-919ab544-fbea-4415-aa68-c15db532d319 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.199505] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86a2ef60-0382-49e8-a72f-3d3caefb0fbe {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.232121] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ee303bc-2130-47c2-8b07-6ff292e21c56 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.239929] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6742313-9081-4f17-8323-5c9425414975 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.253603] env[59857]: DEBUG nova.compute.provider_tree [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 722.262473] env[59857]: DEBUG nova.scheduler.client.report [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 722.277478] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.250s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.278127] env[59857]: ERROR nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Traceback (most recent call last): [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self.driver.spawn(context, instance, image_meta, [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 
722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] vm_ref = self.build_virtual_machine(instance, [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] vif_infos = vmwarevif.get_vif_info(self._session, [ 722.278127] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] for vif in network_info: [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return self._sync_wrapper(fn, *args, **kwargs) [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self.wait() [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self[:] = self._gt.wait() [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return 
self._exit_event.wait() [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] result = hub.switch() [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return self.greenlet.switch() [ 722.278440] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] result = function(*args, **kwargs) [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] return func(*args, **kwargs) [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] raise e [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] nwinfo = self.network_api.allocate_for_instance( [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 
40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] created_port_ids = self._update_ports_for_instance( [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] with excutils.save_and_reraise_exception(): [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.278876] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] self.force_reraise() [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] raise self.value [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] updated_port = self._update_port( [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] _ensure_no_port_binding_failure(port) [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] raise exception.PortBindingFailed(port_id=port['id']) [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. [ 722.279181] env[59857]: ERROR nova.compute.manager [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] [ 722.279181] env[59857]: DEBUG nova.compute.utils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 722.280381] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Build of instance 40115f76-28d8-4f39-9dca-59401f52f22f was re-scheduled: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 722.280785] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 722.281033] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "refresh_cache-40115f76-28d8-4f39-9dca-59401f52f22f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.281161] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquired lock "refresh_cache-40115f76-28d8-4f39-9dca-59401f52f22f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.281313] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 722.477813] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.831898] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.840594] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Releasing lock "refresh_cache-40115f76-28d8-4f39-9dca-59401f52f22f" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.840813] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 722.840970] env[59857]: DEBUG nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 722.841150] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.923806] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.932804] env[59857]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.942426] env[59857]: INFO nova.compute.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Took 0.10 seconds to deallocate network for instance. 
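The traceback above bottoms out in `_ensure_no_port_binding_failure` raising `PortBindingFailed`. A minimal sketch of that check, assuming (as the Nova neutron module does) that Neutron marks a failed binding by setting `binding:vif_type` to `'binding_failed'` on the port; the exception class here is a stand-in for `nova.exception.PortBindingFailed`:

```python
# Stand-in for nova.exception.PortBindingFailed (illustration only).
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        self.port_id = port_id
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")

# Sentinel value Neutron uses for a port whose binding failed.
VIF_TYPE_BINDING_FAILED = 'binding_failed'

def ensure_no_port_binding_failure(port):
    """Raise if Neutron reported a failed binding for this port (sketch)."""
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])
```

When the check fires during `_update_ports_for_instance`, the build is aborted and, as the surrounding log shows, the instance is re-scheduled and its network allocations are torn down.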
[ 723.036227] env[59857]: INFO nova.scheduler.client.report [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Deleted allocations for instance 40115f76-28d8-4f39-9dca-59401f52f22f [ 723.058226] env[59857]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "40115f76-28d8-4f39-9dca-59401f52f22f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.207s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.038273] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "4e0befc8-76e7-484d-957e-55b0aaedc2c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.038581] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "4e0befc8-76e7-484d-957e-55b0aaedc2c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.050430] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 724.118985] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.119265] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.120845] env[59857]: INFO nova.compute.claims [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 724.281559] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a657a79-49c7-4892-bac0-c2119b2deabd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.292420] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b277464-26a9-4707-8fe4-a9c596d4f959 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.338573] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-129910d6-0d9e-4635-acac-d3fe06149606 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.346907] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5dddb8a-73a7-4ef0-8727-1233e2391593 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.368109] env[59857]: DEBUG nova.compute.provider_tree [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.377094] env[59857]: DEBUG nova.scheduler.client.report [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 724.393883] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.393964] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 
tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 724.438100] env[59857]: DEBUG nova.compute.utils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 724.441371] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 724.441590] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 724.450400] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 724.527538] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 724.549841] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 724.550113] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 724.550290] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 724.550541] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 724.550627] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 724.550778] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 724.551040] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 724.551220] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 724.551418] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 724.551602] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 724.551784] env[59857]: DEBUG nova.virt.hardware [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 724.552904] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abca362a-cc8d-47ea-a796-a2fefcdcb697 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.561845] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d20d9771-4220-4d61-a406-66f61094ecce {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.585545] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "8d391d2c-ea85-47d4-a140-03ea6da1c101" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.585764] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "8d391d2c-ea85-47d4-a140-03ea6da1c101" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.601429] env[59857]: DEBUG 
nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 724.695420] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.695605] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.697095] env[59857]: INFO nova.compute.claims [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 724.893451] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fe0ba86-4597-4229-9246-82aee2c6f612 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.909705] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67fd2825-7eef-43e0-a4f1-6573d25814d2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
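The "Inventory has not changed" records above carry `total`, `reserved`, and `allocation_ratio` per resource class. Placement derives the usable capacity of a class as `(total - reserved) * allocation_ratio`; a small sketch using the exact values reported in this log:

```python
def capacity(inv):
    """Usable capacity for one resource class: (total - reserved) * allocation_ratio."""
    return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

# Values taken from the inventory records for provider
# 80c650ad-13a5-4d4e-96b3-a14b31abfa11 in the log above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

caps = {rc: capacity(inv) for rc, inv in inventory.items()}
# 48 vCPUs oversubscribed 4x yield 192 schedulable VCPU units;
# memory and disk are not oversubscribed here.
```

This is why the claims in the log succeed despite many concurrent tempest builds: the scheduler is working against the oversubscribed capacity, not the raw host totals.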
[ 724.949603] env[59857]: ERROR nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. [ 724.949603] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 724.949603] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 724.949603] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 724.949603] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 724.949603] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 724.949603] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 724.949603] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 724.949603] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 724.949603] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 724.949603] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 724.949603] env[59857]: ERROR nova.compute.manager raise self.value [ 724.949603] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 724.949603] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 724.949603] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 724.949603] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 724.950117] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 724.950117] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 724.950117] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. [ 724.950117] env[59857]: ERROR nova.compute.manager [ 724.950117] env[59857]: Traceback (most recent call last): [ 724.950117] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 724.950117] env[59857]: listener.cb(fileno) [ 724.950117] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 724.950117] env[59857]: result = function(*args, **kwargs) [ 724.950117] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 724.950117] env[59857]: return func(*args, **kwargs) [ 724.950117] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 724.950117] env[59857]: raise e [ 724.950117] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 724.950117] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 724.950117] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 724.950117] env[59857]: created_port_ids = self._update_ports_for_instance( [ 724.950117] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 724.950117] env[59857]: with excutils.save_and_reraise_exception(): [ 724.950117] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 724.950117] 
env[59857]: self.force_reraise() [ 724.950117] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 724.950117] env[59857]: raise self.value [ 724.950117] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 724.950117] env[59857]: updated_port = self._update_port( [ 724.950117] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 724.950117] env[59857]: _ensure_no_port_binding_failure(port) [ 724.950117] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 724.950117] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 724.950843] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. [ 724.950843] env[59857]: Removing descriptor: 14 [ 724.950843] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8a0f630-34d8-4dbf-a842-88f7753dffde {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.953742] env[59857]: ERROR nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. 
[ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Traceback (most recent call last): [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] yield resources [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] self.driver.spawn(context, instance, image_meta, [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] self._vmops.spawn(context, instance, image_meta, injected_files, [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] vm_ref = self.build_virtual_machine(instance, [ 724.953742] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] vif_infos = vmwarevif.get_vif_info(self._session, [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 724.954090] env[59857]: ERROR 
nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] for vif in network_info: [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] return self._sync_wrapper(fn, *args, **kwargs) [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] self.wait() [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] self[:] = self._gt.wait() [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] return self._exit_event.wait() [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] result = hub.switch() [ 724.954090] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] return self.greenlet.switch() [ 724.954481] env[59857]: ERROR 
nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] result = function(*args, **kwargs) [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] return func(*args, **kwargs) [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] raise e [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] nwinfo = self.network_api.allocate_for_instance( [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] created_port_ids = self._update_ports_for_instance( [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 724.954481] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] with excutils.save_and_reraise_exception(): [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 
0fc5de88-13a6-498a-848c-35beb772be65] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] self.force_reraise() [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] raise self.value [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] updated_port = self._update_port( [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] _ensure_no_port_binding_failure(port) [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] raise exception.PortBindingFailed(port_id=port['id']) [ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. 
[ 724.954824] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] [ 724.955174] env[59857]: INFO nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Terminating instance [ 724.956253] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 724.956680] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquired lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 724.956680] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 724.960827] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9c8ecc1-42e7-4a81-9cd6-f660e648949e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.983556] env[59857]: DEBUG nova.compute.provider_tree [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed in 
ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.996184] env[59857]: DEBUG nova.scheduler.client.report [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 725.010375] env[59857]: DEBUG nova.policy [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ae0c3fdf5814c20819e4329e87733e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd742fb05f93f44a9b9c8207f47e77730', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 725.014191] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.318s {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.014678] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 725.060158] env[59857]: DEBUG nova.compute.utils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 725.063977] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 725.063977] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 725.077054] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 725.082048] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 725.165290] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 725.197740] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 725.197740] env[59857]: DEBUG nova.virt.hardware [None 
req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 725.197919] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 725.198739] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 725.198739] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 725.198739] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 725.198739] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 725.198739] env[59857]: DEBUG 
nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 725.199099] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 725.199099] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 725.199514] env[59857]: DEBUG nova.virt.hardware [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 725.200080] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b24c2108-b54d-4a15-ad9e-232a6d7f86f8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.210120] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f27602f6-f4b8-40c5-b334-6cb1e42d3cd2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.525018] env[59857]: DEBUG nova.policy [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 
tempest-ServerDiskConfigTestJSON-927379070-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'baef766334764dd9ab481d3a2aacd07b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9e33b2e4b8c439a8e8a557ddda22fce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 726.116088] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "26aa196e-e745-494d-814f-7da3cf18ec14" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.116319] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "26aa196e-e745-494d-814f-7da3cf18ec14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.132684] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 726.195577] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.195577] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.196780] env[59857]: INFO nova.compute.claims [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 726.280251] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.295440] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Releasing lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 726.295918] env[59857]: 
DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 726.296122] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 726.296671] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-196779c0-2170-4dd7-9a55-b802a4ef7e9f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.307145] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fb09611-c763-4e58-8bf7-adc510201036 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.337976] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0fc5de88-13a6-498a-848c-35beb772be65 could not be found. 
[ 726.338227] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 726.338411] env[59857]: INFO nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 726.338664] env[59857]: DEBUG oslo.service.loopingcall [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 726.339238] env[59857]: DEBUG nova.compute.manager [-] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 726.339340] env[59857]: DEBUG nova.network.neutron [-] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 726.427081] env[59857]: DEBUG nova.network.neutron [-] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 726.430796] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c70602f-f723-4ec4-a2b3-817ce2b5b7d4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.434907] env[59857]: DEBUG nova.network.neutron [-] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 726.440218] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c59824f3-91c9-4b30-89a7-04336f3c5742 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.445884] env[59857]: INFO nova.compute.manager [-] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Took 0.11 seconds to deallocate network for instance.
[ 726.448285] env[59857]: DEBUG nova.compute.claims [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 726.448989] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.475055] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74cf204f-8a04-42d1-8f0a-b164037fe35e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.484320] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-827e6fc2-f120-4c04-b498-17a66cd4cf30 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.501153] env[59857]: DEBUG nova.compute.provider_tree [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 726.512718] env[59857]: DEBUG nova.scheduler.client.report [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 726.535119] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.535873] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 726.538600] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.090s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.592472] env[59857]: DEBUG nova.compute.utils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 726.595439] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Not allocating networking since 'none' was specified. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 726.603537] env[59857]: DEBUG nova.compute.manager [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Received event network-changed-78685b50-e23c-45ff-8b69-8ac77b285c14 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 726.603740] env[59857]: DEBUG nova.compute.manager [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Refreshing instance network info cache due to event network-changed-78685b50-e23c-45ff-8b69-8ac77b285c14. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 726.603953] env[59857]: DEBUG oslo_concurrency.lockutils [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] Acquiring lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 726.604110] env[59857]: DEBUG oslo_concurrency.lockutils [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] Acquired lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 726.604360] env[59857]: DEBUG nova.network.neutron [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Refreshing network info cache for port 78685b50-e23c-45ff-8b69-8ac77b285c14 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 726.609075] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 726.688150] env[59857]: DEBUG nova.network.neutron [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 726.703404] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 726.736119] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 726.736119] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 726.736294] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 726.736348] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 726.736474] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 726.736635] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 726.737271] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 726.737271] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 726.737271] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 726.737431] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 726.737575] env[59857]: DEBUG nova.virt.hardware [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 726.739065] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80ffbe8d-76fe-4775-a0e1-4dd7d1671006 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.751189] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-560feec1-16c0-4959-92d3-d98c2c6ec398 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.757203] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67737011-883f-4451-883c-f9557ceca397 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.770505] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Instance VIF info [] {{(pid=59857) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 726.776134] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Creating folder: Project (02855237761b401fbd41810098ed77e9). Parent ref: group-v286134. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 726.779279] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-da13ce4f-c155-44e0-99e2-86c54591e9a1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.781819] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8067c192-e320-48a2-be41-32268b6e67f0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.816730] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d61ae6-347c-4a9c-8539-412fd6589e0b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.819506] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Created folder: Project (02855237761b401fbd41810098ed77e9) in parent group-v286134.
[ 726.819686] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Creating folder: Instances. Parent ref: group-v286147. {{(pid=59857) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 726.819919] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d0fcf5f0-12f4-4ffd-9aa2-13d69ff5a150 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.827023] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75d7a22c-187d-40db-a014-5cb3fe01a7bf {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.832192] env[59857]: INFO nova.virt.vmwareapi.vm_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Created folder: Instances in parent group-v286147.
[ 726.832310] env[59857]: DEBUG oslo.service.loopingcall [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 726.832877] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Creating VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 726.833077] env[59857]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b5a5e268-053e-4b97-b8f9-2640a6e60585 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.856152] env[59857]: DEBUG nova.compute.provider_tree [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 726.862516] env[59857]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 726.862516] env[59857]: value = "task-1341433"
[ 726.862516] env[59857]: _type = "Task"
[ 726.862516] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 726.867900] env[59857]: DEBUG nova.scheduler.client.report [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 726.881829] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341433, 'name': CreateVM_Task} progress is 5%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 726.900685] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.362s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.901441] env[59857]: ERROR nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information.
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Traceback (most recent call last):
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     self.driver.spawn(context, instance, image_meta,
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     vm_ref = self.build_virtual_machine(instance,
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 726.901441] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     for vif in network_info:
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     return self._sync_wrapper(fn, *args, **kwargs)
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     self.wait()
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     self[:] = self._gt.wait()
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     return self._exit_event.wait()
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     result = hub.switch()
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     return self.greenlet.switch()
[ 726.902023] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     result = function(*args, **kwargs)
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     return func(*args, **kwargs)
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     raise e
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     nwinfo = self.network_api.allocate_for_instance(
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     created_port_ids = self._update_ports_for_instance(
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     with excutils.save_and_reraise_exception():
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 726.902610] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     self.force_reraise()
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     raise self.value
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     updated_port = self._update_port(
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     _ensure_no_port_binding_failure(port)
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]     raise exception.PortBindingFailed(port_id=port['id'])
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65] nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information.
[ 726.903187] env[59857]: ERROR nova.compute.manager [instance: 0fc5de88-13a6-498a-848c-35beb772be65]
[ 726.903187] env[59857]: DEBUG nova.compute.utils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 726.905908] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Build of instance 0fc5de88-13a6-498a-848c-35beb772be65 was re-scheduled: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 726.905908] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 726.905908] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 727.186604] env[59857]: ERROR nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.
[ 727.186604] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 727.186604] env[59857]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 727.186604] env[59857]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 727.186604] env[59857]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 727.186604] env[59857]: ERROR nova.compute.manager     self.force_reraise()
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 727.186604] env[59857]: ERROR nova.compute.manager     raise self.value
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 727.186604] env[59857]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 727.186604] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 727.186604] env[59857]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 727.187338] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 727.187338] env[59857]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 727.187338] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.
[ 727.187338] env[59857]: ERROR nova.compute.manager
[ 727.187338] env[59857]: Traceback (most recent call last):
[ 727.187338] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 727.187338] env[59857]:     listener.cb(fileno)
[ 727.187338] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 727.187338] env[59857]:     result = function(*args, **kwargs)
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 727.187338] env[59857]:     return func(*args, **kwargs)
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 727.187338] env[59857]:     raise e
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 727.187338] env[59857]:     nwinfo = self.network_api.allocate_for_instance(
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 727.187338] env[59857]:     created_port_ids = self._update_ports_for_instance(
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 727.187338] env[59857]:     with excutils.save_and_reraise_exception():
[ 727.187338] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 727.187338] env[59857]:     self.force_reraise()
[ 727.187338] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 727.187338] env[59857]:     raise self.value
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 727.187338] env[59857]:     updated_port = self._update_port(
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 727.187338] env[59857]:     _ensure_no_port_binding_failure(port)
[ 727.187338] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 727.187338] env[59857]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 727.188109] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.
[ 727.188109] env[59857]: Removing descriptor: 15
[ 727.188109] env[59857]: ERROR nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Traceback (most recent call last):
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     yield resources
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     self.driver.spawn(context, instance, image_meta,
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 727.188109] env[59857]: ERROR nova.compute.manager [instance:
c573864b-774a-4e4d-be80-5bc9bbd1659d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.188109] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] vm_ref = self.build_virtual_machine(instance, [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] for vif in network_info: [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] return self._sync_wrapper(fn, *args, **kwargs) [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] self.wait() [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] self[:] = self._gt.wait() [ 727.188423] 
env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] return self._exit_event.wait() [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.188423] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] result = hub.switch() [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] return self.greenlet.switch() [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] result = function(*args, **kwargs) [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] return func(*args, **kwargs) [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] raise e [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] nwinfo = self.network_api.allocate_for_instance( [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] created_port_ids = self._update_ports_for_instance( [ 727.188769] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] with excutils.save_and_reraise_exception(): [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] self.force_reraise() [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] raise self.value [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] updated_port = self._update_port( [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] _ensure_no_port_binding_failure(port) [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] raise exception.PortBindingFailed(port_id=port['id']) [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information. [ 727.189106] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] [ 727.189401] env[59857]: INFO nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Terminating instance [ 727.197024] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "refresh_cache-c573864b-774a-4e4d-be80-5bc9bbd1659d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.197024] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquired lock "refresh_cache-c573864b-774a-4e4d-be80-5bc9bbd1659d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.197024] env[59857]: DEBUG nova.network.neutron [None 
req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.257287] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "949f98a2-9316-4cbd-b1e3-b05d08a68997" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.257466] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "949f98a2-9316-4cbd-b1e3-b05d08a68997" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.271392] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 727.345150] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.345395] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.351180] env[59857]: INFO nova.compute.claims [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 727.371030] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.379792] env[59857]: DEBUG oslo_vmware.api [-] Task: {'id': task-1341433, 'name': CreateVM_Task, 'duration_secs': 0.299294} completed successfully. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 727.379792] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Created VM on the ESX host {{(pid=59857) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 727.379792] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.379792] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.380256] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 727.380434] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-68bda6a8-3ba0-499f-ba60-13e0ce9abb86 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.391021] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Waiting for the task: (returnval){ [ 727.391021] env[59857]: value = 
"session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52b5a889-015d-d0a4-1386-9d1e92fb99ab" [ 727.391021] env[59857]: _type = "Task" [ 727.391021] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 727.399864] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52b5a889-015d-d0a4-1386-9d1e92fb99ab, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 727.579155] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a21508-535f-40ed-a7b6-bb035d1a7959 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.589043] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfdd8606-44a8-4de8-b6d8-3b627839ba6e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.621913] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fb9bbcb-44f4-4442-bdd9-ab59fb128b7d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.629449] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1105f133-7da2-426f-a039-1d47ac29c490 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.644260] env[59857]: DEBUG nova.compute.provider_tree [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Inventory has not changed in ProviderTree for provider: 
80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.658234] env[59857]: DEBUG nova.scheduler.client.report [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.682634] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.683178] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 727.721868] env[59857]: DEBUG nova.compute.utils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 727.726709] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 727.726709] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 727.733414] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 727.834753] env[59857]: DEBUG nova.network.neutron [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.841957] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 727.848963] env[59857]: DEBUG oslo_concurrency.lockutils [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] Releasing lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 727.849109] env[59857]: DEBUG nova.compute.manager [req-795e17d8-30ba-4b32-8dc0-f7a28f26a6eb req-d855297b-8fdb-4694-8a00-9be479e86db0 service nova] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Received event network-vif-deleted-78685b50-e23c-45ff-8b69-8ac77b285c14 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 727.849469] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquired lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.849616] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 
tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.871884] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 727.872132] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 727.872284] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 727.872638] env[59857]: DEBUG nova.virt.hardware [None 
req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 727.872638] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 727.872734] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 727.872905] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 727.873085] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 727.873255] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 727.873409] env[59857]: DEBUG nova.virt.hardware 
[None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 727.873572] env[59857]: DEBUG nova.virt.hardware [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 727.874684] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8040e012-80ef-4389-a6fc-bfea62640669 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.883974] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddba9b41-81a8-481a-a234-d49880df9802 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.906300] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 727.906559] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Processing image 4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 727.906768] env[59857]: DEBUG 
oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.951424] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.984694] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Successfully created port: 5cb065d7-5d51-4bed-96ff-fecc4da6167d {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 728.094058] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.109282] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Releasing lock "refresh_cache-c573864b-774a-4e4d-be80-5bc9bbd1659d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.109282] env[59857]: DEBUG nova.compute.manager [None 
req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 728.109282] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 728.109457] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-94ff5cfe-63ee-4280-911c-5108ba65a5fb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 728.118988] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7c4c0aa-05e8-4f7c-a57c-54e9508af6a5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 728.144131] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c573864b-774a-4e4d-be80-5bc9bbd1659d could not be found.
[ 728.144367] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 728.144611] env[59857]: INFO nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 728.145306] env[59857]: DEBUG oslo.service.loopingcall [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 728.146803] env[59857]: DEBUG nova.policy [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f3bb547edf647c1a12433acc70091dc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ff7f4f6340440dfbe17b4c3b7a33c1d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}}
[ 728.148291] env[59857]: DEBUG nova.compute.manager [-] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 728.148709] env[59857]: DEBUG nova.network.neutron [-] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 728.233858] env[59857]: DEBUG nova.network.neutron [-] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 728.243188] env[59857]: DEBUG nova.network.neutron [-] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 728.258477] env[59857]: INFO nova.compute.manager [-] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Took 0.11 seconds to deallocate network for instance.
[ 728.262340] env[59857]: DEBUG nova.compute.claims [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 728.262533] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 728.262716] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 728.443666] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a1c205-dfb3-4957-9bb2-c7a554014d65 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 728.451514] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a19bd891-a393-4397-afc6-de7bc996642c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 728.480827] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec4d4334-60cc-479f-b7e0-e0b4e7085360 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 728.488142] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75fd29f7-9c4d-4128-b3fb-eb2171a88768 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 728.501279] env[59857]: DEBUG nova.compute.provider_tree [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 728.502944] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Successfully created port: 2af06c29-ed0d-4674-acb8-1a3778b201f6 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 728.514069] env[59857]: DEBUG nova.scheduler.client.report [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 728.527222] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.264s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 728.527865] env[59857]: ERROR nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Traceback (most recent call last):
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     self.driver.spawn(context, instance, image_meta,
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     vm_ref = self.build_virtual_machine(instance,
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 728.527865] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     for vif in network_info:
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     return self._sync_wrapper(fn, *args, **kwargs)
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     self.wait()
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     self[:] = self._gt.wait()
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     return self._exit_event.wait()
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     result = hub.switch()
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     return self.greenlet.switch()
[ 728.528332] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     result = function(*args, **kwargs)
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     return func(*args, **kwargs)
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     raise e
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     nwinfo = self.network_api.allocate_for_instance(
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     created_port_ids = self._update_ports_for_instance(
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     with excutils.save_and_reraise_exception():
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 728.528874] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     self.force_reraise()
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     raise self.value
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     updated_port = self._update_port(
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     _ensure_no_port_binding_failure(port)
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]     raise exception.PortBindingFailed(port_id=port['id'])
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.
[ 728.529364] env[59857]: ERROR nova.compute.manager [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d]
[ 728.529364] env[59857]: DEBUG nova.compute.utils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 728.530274] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Build of instance c573864b-774a-4e4d-be80-5bc9bbd1659d was re-scheduled: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 728.530679] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 728.530898] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "refresh_cache-c573864b-774a-4e4d-be80-5bc9bbd1659d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 728.531046] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquired lock "refresh_cache-c573864b-774a-4e4d-be80-5bc9bbd1659d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 728.531203] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 728.634678] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 728.754189] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 728.767119] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Releasing lock "refresh_cache-0fc5de88-13a6-498a-848c-35beb772be65" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 728.767313] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 728.767514] env[59857]: DEBUG nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 728.767679] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 728.893698] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 728.902022] env[59857]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 728.911884] env[59857]: INFO nova.compute.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Took 0.14 seconds to deallocate network for instance.
[ 729.018194] env[59857]: INFO nova.scheduler.client.report [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Deleted allocations for instance 0fc5de88-13a6-498a-848c-35beb772be65
[ 729.029282] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 729.046767] env[59857]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "0fc5de88-13a6-498a-848c-35beb772be65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.138s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.057411] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Releasing lock "refresh_cache-c573864b-774a-4e4d-be80-5bc9bbd1659d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 729.057894] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 729.057894] env[59857]: DEBUG nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 729.057984] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 729.071293] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "197488cc-ac6b-4561-8d57-f372c6493573" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 729.071438] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "197488cc-ac6b-4561-8d57-f372c6493573" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 729.082627] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 729.125308] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 729.136434] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 729.136702] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 729.138279] env[59857]: INFO nova.compute.claims [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 729.143314] env[59857]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 729.151871] env[59857]: INFO nova.compute.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Took 0.09 seconds to deallocate network for instance.
[ 729.279727] env[59857]: INFO nova.scheduler.client.report [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Deleted allocations for instance c573864b-774a-4e4d-be80-5bc9bbd1659d
[ 729.312262] env[59857]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "c573864b-774a-4e4d-be80-5bc9bbd1659d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.827s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.384871] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e706b219-24cd-485e-9946-9c7089865130 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.394976] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c0fb1e-862a-44c4-90a8-8f6f7a3afb47 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.434915] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fae790ae-2772-4199-ad97-8b36fd9215ab {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.441028] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481ec310-368b-4927-9dde-94466d29f6ee {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.454738] env[59857]: DEBUG nova.compute.provider_tree [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 729.464123] env[59857]: DEBUG nova.scheduler.client.report [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 729.468030] env[59857]: ERROR nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 729.468030] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 729.468030] env[59857]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 729.468030] env[59857]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 729.468030] env[59857]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 729.468030] env[59857]: ERROR nova.compute.manager     self.force_reraise()
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 729.468030] env[59857]: ERROR nova.compute.manager     raise self.value
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 729.468030] env[59857]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 729.468030] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 729.468030] env[59857]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 729.469565] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 729.469565] env[59857]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 729.469565] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 729.469565] env[59857]: ERROR nova.compute.manager
[ 729.469565] env[59857]: Traceback (most recent call last):
[ 729.469565] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 729.469565] env[59857]:     listener.cb(fileno)
[ 729.469565] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 729.469565] env[59857]:     result = function(*args, **kwargs)
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 729.469565] env[59857]:     return func(*args, **kwargs)
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 729.469565] env[59857]:     raise e
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 729.469565] env[59857]:     nwinfo = self.network_api.allocate_for_instance(
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 729.469565] env[59857]:     created_port_ids = self._update_ports_for_instance(
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 729.469565] env[59857]:     with excutils.save_and_reraise_exception():
[ 729.469565] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 729.469565] env[59857]:     self.force_reraise()
[ 729.469565] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 729.469565] env[59857]:     raise self.value
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 729.469565] env[59857]:     updated_port = self._update_port(
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 729.469565] env[59857]:     _ensure_no_port_binding_failure(port)
[ 729.469565] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 729.469565] env[59857]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 729.470393] env[59857]: nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 729.470393] env[59857]: Removing descriptor: 12
[ 729.470393] env[59857]: ERROR nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Traceback (most recent call last):
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     yield resources
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self.driver.spawn(context, instance, image_meta,
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 729.470393] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     vm_ref = self.build_virtual_machine(instance,
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     for vif in network_info:
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return self._sync_wrapper(fn, *args, **kwargs)
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self.wait()
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self[:] = self._gt.wait()
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return self._exit_event.wait()
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 729.470716] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     result = hub.switch()
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return self.greenlet.switch()
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     result = function(*args, **kwargs)
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return func(*args, **kwargs)
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     raise e
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     nwinfo = self.network_api.allocate_for_instance(
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     created_port_ids = self._update_ports_for_instance(
[ 729.471080] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     with excutils.save_and_reraise_exception():
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self.force_reraise()
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     raise self.value
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     updated_port = self._update_port(
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     _ensure_no_port_binding_failure(port)
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     raise exception.PortBindingFailed(port_id=port['id'])
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 729.471408] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]
[ 729.471744] env[59857]: INFO nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Terminating instance
[ 729.471744] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "refresh_cache-72b94cae-d12d-4228-8ca0-20fde3095c38" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 729.471744] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquired lock "refresh_cache-72b94cae-d12d-4228-8ca0-20fde3095c38" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 729.471744] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 729.482047] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.482047] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 729.528350] env[59857]: DEBUG nova.compute.utils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 729.529777] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 729.529777] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 729.533726] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 729.542695] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 729.621894] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 729.654571] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 729.654571] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 729.654571] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 729.654860] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 729.654860] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 729.654860] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 729.655722] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 729.655722] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 729.655722] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 729.655722] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 729.655722] env[59857]: DEBUG nova.virt.hardware [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 729.656843] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d98f31a-d201-44a8-9ebe-68543da958eb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.667073] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18edf8fd-1d5b-4488-a552-bac72473451b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.888208] env[59857]: DEBUG nova.policy [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a2e115241bf4f4491e4736c14c8c75f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38f2b7c76cb04e49b1b8ac75980011b2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}}
[ 730.002177] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.012272] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Releasing lock "refresh_cache-72b94cae-d12d-4228-8ca0-20fde3095c38" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 730.012791] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 730.013123] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 730.013681] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-56b68099-07d5-4697-99d4-d726ab7c0f65 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.024480] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9857f7d-65be-4af2-9748-9d0b39766b4c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.052523] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 72b94cae-d12d-4228-8ca0-20fde3095c38 could not be found.
[ 730.052523] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 730.052523] env[59857]: INFO nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 730.052523] env[59857]: DEBUG oslo.service.loopingcall [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 730.052808] env[59857]: DEBUG nova.compute.manager [-] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 730.053074] env[59857]: DEBUG nova.network.neutron [-] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 730.102712] env[59857]: DEBUG nova.network.neutron [-] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 730.110429] env[59857]: DEBUG nova.network.neutron [-] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.122240] env[59857]: INFO nova.compute.manager [-] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Took 0.07 seconds to deallocate network for instance.
[ 730.124569] env[59857]: DEBUG nova.compute.claims [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 730.124842] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 730.124951] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 730.168975] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Successfully created port: 33db9413-9863-4b89-8ca0-4838adac1c47 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 730.321625] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a78d18bb-d29f-44c1-babf-fbe9a5c1c8e6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.329573] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-353ce041-2650-4903-9ddc-03fcbb2984b5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.369063] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-556773b4-ec12-4531-8ad6-de16f66eaa9b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.379157] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9a49697-a684-4c1c-98ef-7465ef18a6b8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.394014] env[59857]: DEBUG nova.compute.provider_tree [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 730.410287] env[59857]: DEBUG nova.scheduler.client.report [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 730.426319] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 730.426978] env[59857]: ERROR nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Traceback (most recent call last):
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self.driver.spawn(context, instance, image_meta,
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     vm_ref = self.build_virtual_machine(instance,
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 730.426978] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     for vif in network_info:
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return self._sync_wrapper(fn, *args, **kwargs)
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self.wait()
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self[:] = self._gt.wait()
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return self._exit_event.wait()
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     result = hub.switch()
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return self.greenlet.switch()
[ 730.427354] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     result = function(*args, **kwargs)
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     return func(*args, **kwargs)
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     raise e
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     nwinfo = self.network_api.allocate_for_instance(
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     created_port_ids = self._update_ports_for_instance(
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     with excutils.save_and_reraise_exception():
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 730.427769] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     self.force_reraise()
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     raise self.value
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     updated_port = self._update_port(
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     _ensure_no_port_binding_failure(port)
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]     raise exception.PortBindingFailed(port_id=port['id'])
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
[ 730.428118] env[59857]: ERROR nova.compute.manager [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38]
[ 730.428381] env[59857]: DEBUG nova.compute.utils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.
{{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 730.430444] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Build of instance 72b94cae-d12d-4228-8ca0-20fde3095c38 was re-scheduled: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 730.430884] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 730.431113] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "refresh_cache-72b94cae-d12d-4228-8ca0-20fde3095c38" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 730.431250] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquired lock "refresh_cache-72b94cae-d12d-4228-8ca0-20fde3095c38" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 730.431405] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 
tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 730.562999] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.200411] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.215994] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Releasing lock "refresh_cache-72b94cae-d12d-4228-8ca0-20fde3095c38" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.215994] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 731.215994] env[59857]: DEBUG nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 731.215994] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.253187] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Successfully created port: 573439c9-df7f-4b26-8c67-091dcc6e41d9 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 731.260449] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.270828] env[59857]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.280201] env[59857]: INFO nova.compute.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Took 0.06 seconds to deallocate network for instance. [ 731.383227] env[59857]: INFO nova.scheduler.client.report [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Deleted allocations for instance 72b94cae-d12d-4228-8ca0-20fde3095c38 [ 731.402338] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "72b94cae-d12d-4228-8ca0-20fde3095c38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.053s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.892387] env[59857]: DEBUG nova.compute.manager [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Received event network-changed-5cb065d7-5d51-4bed-96ff-fecc4da6167d {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 733.892638] 
env[59857]: DEBUG nova.compute.manager [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Refreshing instance network info cache due to event network-changed-5cb065d7-5d51-4bed-96ff-fecc4da6167d. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 733.892764] env[59857]: DEBUG oslo_concurrency.lockutils [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] Acquiring lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.892896] env[59857]: DEBUG oslo_concurrency.lockutils [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] Acquired lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.893065] env[59857]: DEBUG nova.network.neutron [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Refreshing network info cache for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 733.962333] env[59857]: DEBUG nova.network.neutron [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.071814] env[59857]: ERROR nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. [ 734.071814] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 734.071814] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.071814] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 734.071814] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.071814] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 734.071814] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.071814] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 734.071814] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.071814] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 734.071814] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.071814] env[59857]: ERROR nova.compute.manager raise self.value [ 734.071814] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.071814] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 734.071814] env[59857]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.071814] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 734.073472] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.073472] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 734.073472] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. [ 734.073472] env[59857]: ERROR nova.compute.manager [ 734.073472] env[59857]: Traceback (most recent call last): [ 734.073472] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 734.073472] env[59857]: listener.cb(fileno) [ 734.073472] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.073472] env[59857]: result = function(*args, **kwargs) [ 734.073472] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.073472] env[59857]: return func(*args, **kwargs) [ 734.073472] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.073472] env[59857]: raise e [ 734.073472] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.073472] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 734.073472] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.073472] env[59857]: created_port_ids = self._update_ports_for_instance( [ 734.073472] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.073472] env[59857]: with excutils.save_and_reraise_exception(): [ 734.073472] env[59857]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.073472] env[59857]: self.force_reraise() [ 734.073472] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.073472] env[59857]: raise self.value [ 734.073472] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.073472] env[59857]: updated_port = self._update_port( [ 734.073472] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.073472] env[59857]: _ensure_no_port_binding_failure(port) [ 734.073472] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.073472] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 734.076150] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. [ 734.076150] env[59857]: Removing descriptor: 21 [ 734.076150] env[59857]: ERROR nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. 
[ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Traceback (most recent call last): [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] yield resources [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] self.driver.spawn(context, instance, image_meta, [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 734.076150] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] vm_ref = self.build_virtual_machine(instance, [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] vif_infos = vmwarevif.get_vif_info(self._session, [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 734.076561] env[59857]: ERROR 
nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] for vif in network_info: [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] return self._sync_wrapper(fn, *args, **kwargs) [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] self.wait() [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] self[:] = self._gt.wait() [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] return self._exit_event.wait() [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.076561] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] result = hub.switch() [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] return self.greenlet.switch() [ 734.081536] env[59857]: ERROR 
nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] result = function(*args, **kwargs) [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] return func(*args, **kwargs) [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] raise e [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] nwinfo = self.network_api.allocate_for_instance( [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] created_port_ids = self._update_ports_for_instance( [ 734.081536] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] with excutils.save_and_reraise_exception(): [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 
4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] self.force_reraise() [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] raise self.value [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] updated_port = self._update_port( [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] _ensure_no_port_binding_failure(port) [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] raise exception.PortBindingFailed(port_id=port['id']) [ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. 
[ 734.082620] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] [ 734.083025] env[59857]: INFO nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Terminating instance [ 734.083025] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.287476] env[59857]: ERROR nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. 
[ 734.287476] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 734.287476] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.287476] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 734.287476] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.287476] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 734.287476] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.287476] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 734.287476] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.287476] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 734.287476] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.287476] env[59857]: ERROR nova.compute.manager raise self.value [ 734.287476] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.287476] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 734.287476] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.287476] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 734.287959] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.287959] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 734.287959] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. [ 734.287959] env[59857]: ERROR nova.compute.manager [ 734.287959] env[59857]: Traceback (most recent call last): [ 734.288189] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 734.288189] env[59857]: listener.cb(fileno) [ 734.288189] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.288189] env[59857]: result = function(*args, **kwargs) [ 734.288189] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.288189] env[59857]: return func(*args, **kwargs) [ 734.288189] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.288189] env[59857]: raise e [ 734.288189] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.288189] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 734.288189] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.288189] env[59857]: created_port_ids = self._update_ports_for_instance( [ 734.288189] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.288189] env[59857]: with excutils.save_and_reraise_exception(): [ 734.288189] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.288189] env[59857]: self.force_reraise() [ 734.288189] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.288189] env[59857]: raise self.value [ 734.288189] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.288189] env[59857]: updated_port = self._update_port( [ 734.288189] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.288189] env[59857]: _ensure_no_port_binding_failure(port) [ 734.288189] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.288189] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 734.288844] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. [ 734.288844] env[59857]: Removing descriptor: 19 [ 734.289638] env[59857]: ERROR nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Traceback (most recent call last): [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] yield resources [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self.driver.spawn(context, instance, image_meta, [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.289638] env[59857]: ERROR nova.compute.manager 
[instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] vm_ref = self.build_virtual_machine(instance, [ 734.289638] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] vif_infos = vmwarevif.get_vif_info(self._session, [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] for vif in network_info: [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return self._sync_wrapper(fn, *args, **kwargs) [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self.wait() [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self[:] = self._gt.wait() [ 734.290020] 
env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return self._exit_event.wait() [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] result = hub.switch() [ 734.290020] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return self.greenlet.switch() [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] result = function(*args, **kwargs) [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return func(*args, **kwargs) [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] raise e [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] nwinfo = self.network_api.allocate_for_instance( [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] created_port_ids = self._update_ports_for_instance( [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.290447] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] with excutils.save_and_reraise_exception(): [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self.force_reraise() [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] raise self.value [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] updated_port = self._update_port( [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] _ensure_no_port_binding_failure(port) [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] raise exception.PortBindingFailed(port_id=port['id']) [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. [ 734.290845] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] [ 734.291198] env[59857]: INFO nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Terminating instance [ 734.294052] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "refresh_cache-8d391d2c-ea85-47d4-a140-03ea6da1c101" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.294215] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquired lock "refresh_cache-8d391d2c-ea85-47d4-a140-03ea6da1c101" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.294380] env[59857]: DEBUG nova.network.neutron [None 
req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 734.371277] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.373642] env[59857]: DEBUG nova.network.neutron [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.385937] env[59857]: DEBUG oslo_concurrency.lockutils [req-b165e3e3-606b-40f0-8855-c99200d74d3a req-2b7717c6-dd1e-4e5e-b869-0cdc04adb1b3 service nova] Releasing lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.386675] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.386897] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Building network info cache for 
instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 734.492543] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.104689] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.112962] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Releasing lock "refresh_cache-8d391d2c-ea85-47d4-a140-03ea6da1c101" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.113865] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 735.113865] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 735.114055] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-77de4c64-744a-459f-a156-c035815bd21e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.124275] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9b5cceb-4e9d-4579-aef9-b15f72d07785 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.152756] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8d391d2c-ea85-47d4-a140-03ea6da1c101 could not be found. [ 735.153100] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 735.153177] env[59857]: INFO nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 735.153490] env[59857]: DEBUG oslo.service.loopingcall [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.154288] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.157828] env[59857]: DEBUG nova.compute.manager [-] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 735.157828] env[59857]: DEBUG nova.network.neutron [-] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 735.165395] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.165876] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 735.166091] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 735.166623] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-59af359d-1e8f-4c00-99c6-d96ab144749d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.175453] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a55a281-be2a-4eb8-91a8-f9e3683d3454 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.199558] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6 could not be found. [ 735.199785] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 735.199959] env[59857]: INFO nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Took 0.03 seconds to destroy the instance on the hypervisor. 
[ 735.200204] env[59857]: DEBUG oslo.service.loopingcall [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.200420] env[59857]: DEBUG nova.compute.manager [-] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 735.200515] env[59857]: DEBUG nova.network.neutron [-] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 735.213438] env[59857]: DEBUG nova.network.neutron [-] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.221529] env[59857]: DEBUG nova.network.neutron [-] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.229393] env[59857]: INFO nova.compute.manager [-] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Took 0.07 seconds to deallocate network for instance. 
[ 735.231815] env[59857]: DEBUG nova.compute.claims [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 735.232062] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.232326] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.278476] env[59857]: DEBUG nova.network.neutron [-] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.285707] env[59857]: DEBUG nova.network.neutron [-] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.295838] env[59857]: INFO nova.compute.manager [-] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Took 0.10 seconds to deallocate network for instance. 
[ 735.297844] env[59857]: DEBUG nova.compute.claims [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 735.299505] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.396069] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aac11f5e-b98a-41e9-a962-d42144a79728 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.402742] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6edc587e-df95-4727-9e80-56ac3a6a79dd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.442118] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b901fde-e87f-45bf-896e-cc52ae13441b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.450225] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b71cd7-8636-4f0a-ad92-61942ef9f918 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.467502] env[59857]: DEBUG nova.compute.provider_tree [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory 
has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.481548] env[59857]: DEBUG nova.scheduler.client.report [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.498487] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.266s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.499241] env[59857]: ERROR nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. 
[ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Traceback (most recent call last): [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self.driver.spawn(context, instance, image_meta, [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self._vmops.spawn(context, instance, image_meta, injected_files, [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] vm_ref = self.build_virtual_machine(instance, [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] vif_infos = vmwarevif.get_vif_info(self._session, [ 735.499241] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] for vif in network_info: [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 735.499674] env[59857]: ERROR 
nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return self._sync_wrapper(fn, *args, **kwargs) [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self.wait() [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] self[:] = self._gt.wait() [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return self._exit_event.wait() [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] result = hub.switch() [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] return self.greenlet.switch() [ 735.499674] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] result = function(*args, **kwargs) [ 
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     return func(*args, **kwargs)
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     raise e
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     nwinfo = self.network_api.allocate_for_instance(
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     created_port_ids = self._update_ports_for_instance(
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     with excutils.save_and_reraise_exception():
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 735.499998] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     self.force_reraise()
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     raise self.value
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     updated_port = self._update_port(
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     _ensure_no_port_binding_failure(port)
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]     raise exception.PortBindingFailed(port_id=port['id'])
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information.
[ 735.500322] env[59857]: ERROR nova.compute.manager [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101]
[ 735.500322] env[59857]: DEBUG nova.compute.utils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 735.501642] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.203s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 735.505321] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Build of instance 8d391d2c-ea85-47d4-a140-03ea6da1c101 was re-scheduled: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 735.505795] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 735.506067] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "refresh_cache-8d391d2c-ea85-47d4-a140-03ea6da1c101" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 735.506250] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquired lock "refresh_cache-8d391d2c-ea85-47d4-a140-03ea6da1c101" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 735.506488] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 735.585825] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 735.651981] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e97ac82-0716-481a-ae03-50fbb0669bd5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 735.661025] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ce6ba0b-fddb-4bd8-9889-e5e553e87703 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 735.704112] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d888bf-accf-4b79-8658-edb05049b035 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 735.712131] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0264309-27ad-4b8a-bf28-1c13b2f82e05 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 735.725674] env[59857]: DEBUG nova.compute.provider_tree [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 735.736017] env[59857]: DEBUG nova.scheduler.client.report [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 735.760108] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.259s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 735.760682] env[59857]: ERROR nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information.
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Traceback (most recent call last):
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     self.driver.spawn(context, instance, image_meta,
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     vm_ref = self.build_virtual_machine(instance,
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 735.760682] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     for vif in network_info:
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     return self._sync_wrapper(fn, *args, **kwargs)
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     self.wait()
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     self[:] = self._gt.wait()
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     return self._exit_event.wait()
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     result = hub.switch()
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     return self.greenlet.switch()
[ 735.761079] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     result = function(*args, **kwargs)
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     return func(*args, **kwargs)
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     raise e
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     nwinfo = self.network_api.allocate_for_instance(
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     created_port_ids = self._update_ports_for_instance(
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     with excutils.save_and_reraise_exception():
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 735.761541] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     self.force_reraise()
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     raise self.value
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     updated_port = self._update_port(
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     _ensure_no_port_binding_failure(port)
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]     raise exception.PortBindingFailed(port_id=port['id'])
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information.
[ 735.761946] env[59857]: ERROR nova.compute.manager [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6]
[ 735.761946] env[59857]: DEBUG nova.compute.utils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 735.763719] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Build of instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6 was re-scheduled: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 735.764347] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 735.764492] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 735.764665] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 735.764849] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 736.020171] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 736.197089] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 736.218024] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Releasing lock "refresh_cache-8d391d2c-ea85-47d4-a140-03ea6da1c101" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 736.218024] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 736.218168] env[59857]: DEBUG nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 736.218261] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 736.264717] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 736.279286] env[59857]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 736.292652] env[59857]: INFO nova.compute.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Took 0.07 seconds to deallocate network for instance.
[ 736.397885] env[59857]: INFO nova.scheduler.client.report [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Deleted allocations for instance 8d391d2c-ea85-47d4-a140-03ea6da1c101
[ 736.423324] env[59857]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "8d391d2c-ea85-47d4-a140-03ea6da1c101" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.835s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 736.423324] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 736.447816] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-4e0befc8-76e7-484d-957e-55b0aaedc2c6" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 736.447816] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 736.447816] env[59857]: DEBUG nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 736.447816] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 736.495123] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 736.512084] env[59857]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 736.525020] env[59857]: INFO nova.compute.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Took 0.08 seconds to deallocate network for instance.
[ 736.635184] env[59857]: INFO nova.scheduler.client.report [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Deleted allocations for instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6
[ 736.659771] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "4e0befc8-76e7-484d-957e-55b0aaedc2c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.621s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 737.434683] env[59857]: ERROR nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information.
[ 737.434683] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.434683] env[59857]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.434683] env[59857]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.434683] env[59857]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.434683] env[59857]: ERROR nova.compute.manager     self.force_reraise()
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.434683] env[59857]: ERROR nova.compute.manager     raise self.value
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.434683] env[59857]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 737.434683] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.434683] env[59857]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 737.435265] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.435265] env[59857]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 737.435265] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information.
[ 737.435265] env[59857]: ERROR nova.compute.manager
[ 737.435265] env[59857]: Traceback (most recent call last):
[ 737.435265] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 737.435265] env[59857]:     listener.cb(fileno)
[ 737.435265] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 737.435265] env[59857]:     result = function(*args, **kwargs)
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 737.435265] env[59857]:     return func(*args, **kwargs)
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 737.435265] env[59857]:     raise e
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.435265] env[59857]:     nwinfo = self.network_api.allocate_for_instance(
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.435265] env[59857]:     created_port_ids = self._update_ports_for_instance(
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.435265] env[59857]:     with excutils.save_and_reraise_exception():
[ 737.435265] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.435265] env[59857]:     self.force_reraise()
[ 737.435265] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.435265] env[59857]:     raise self.value
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.435265] env[59857]:     updated_port = self._update_port(
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.435265] env[59857]:     _ensure_no_port_binding_failure(port)
[ 737.435265] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.435265] env[59857]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 737.436077] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information.
[ 737.436077] env[59857]: Removing descriptor: 15
[ 737.436077] env[59857]: ERROR nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information.
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Traceback (most recent call last):
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     yield resources
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     self.driver.spawn(context, instance, image_meta,
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 737.436077] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     vm_ref = self.build_virtual_machine(instance,
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     for vif in network_info:
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     return self._sync_wrapper(fn, *args, **kwargs)
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     self.wait()
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     self[:] = self._gt.wait()
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     return self._exit_event.wait()
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 737.436494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     result = hub.switch()
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     return self.greenlet.switch()
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     result = function(*args, **kwargs)
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     return func(*args, **kwargs)
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     raise e
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     nwinfo = self.network_api.allocate_for_instance(
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     created_port_ids = self._update_ports_for_instance(
[ 737.436934] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     with excutils.save_and_reraise_exception():
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     self.force_reraise()
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     raise self.value
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]     updated_port = self._update_port(
[ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573]   File
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] _ensure_no_port_binding_failure(port) [ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] raise exception.PortBindingFailed(port_id=port['id']) [ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information. [ 737.437250] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] [ 737.437562] env[59857]: INFO nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Terminating instance [ 737.438361] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "refresh_cache-197488cc-ac6b-4561-8d57-f372c6493573" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 737.438534] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquired lock "refresh_cache-197488cc-ac6b-4561-8d57-f372c6493573" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 737.438710] env[59857]: DEBUG nova.network.neutron [None 
req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 737.499767] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 737.974683] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.987877] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Releasing lock "refresh_cache-197488cc-ac6b-4561-8d57-f372c6493573" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 737.988320] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 737.988507] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 737.989162] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f1124f41-375e-4620-a96e-eee9b4fcc3b0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.000524] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b68f376-eea3-48d6-9059-a53211739d75 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.024805] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 197488cc-ac6b-4561-8d57-f372c6493573 could not be found. [ 738.024805] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 738.024967] env[59857]: INFO nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 738.025150] env[59857]: DEBUG oslo.service.loopingcall [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 738.026353] env[59857]: DEBUG nova.compute.manager [-] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 738.026353] env[59857]: DEBUG nova.network.neutron [-] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 738.071670] env[59857]: DEBUG nova.network.neutron [-] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.082575] env[59857]: DEBUG nova.network.neutron [-] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.095595] env[59857]: INFO nova.compute.manager [-] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Took 0.07 seconds to deallocate network for instance. 
[ 738.099635] env[59857]: DEBUG nova.compute.claims [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 738.099635] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.099635] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.238409] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-745f5110-c380-42fe-8cf2-e0f3326800c9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.247325] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d697e51-7709-455c-92f0-14ea5fffb5c3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.280295] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-854065cd-d604-40d8-8bbf-5b8f8f9a8c46 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.288131] env[59857]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12b73e9c-7afd-4c3f-8047-1f661958dc34 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.303234] env[59857]: DEBUG nova.compute.provider_tree [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 738.310708] env[59857]: DEBUG nova.scheduler.client.report [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 738.325160] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 738.325782] env[59857]: ERROR nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] 
[instance: 197488cc-ac6b-4561-8d57-f372c6493573] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information. [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Traceback (most recent call last): [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] self.driver.spawn(context, instance, image_meta, [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] vm_ref = self.build_virtual_machine(instance, [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] vif_infos = vmwarevif.get_vif_info(self._session, [ 738.325782] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] for vif 
in network_info: [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] return self._sync_wrapper(fn, *args, **kwargs) [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] self.wait() [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] self[:] = self._gt.wait() [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] return self._exit_event.wait() [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] result = hub.switch() [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] return self.greenlet.switch() [ 738.326129] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] result = function(*args, **kwargs) [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] return func(*args, **kwargs) [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] raise e [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] nwinfo = self.network_api.allocate_for_instance( [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] created_port_ids = self._update_ports_for_instance( [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] with excutils.save_and_reraise_exception(): [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.326494] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] self.force_reraise() [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] raise self.value [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] updated_port = self._update_port( [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] _ensure_no_port_binding_failure(port) [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] raise exception.PortBindingFailed(port_id=port['id']) [ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information. 
[ 738.326838] env[59857]: ERROR nova.compute.manager [instance: 197488cc-ac6b-4561-8d57-f372c6493573] [ 738.326838] env[59857]: DEBUG nova.compute.utils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 738.328184] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Build of instance 197488cc-ac6b-4561-8d57-f372c6493573 was re-scheduled: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 738.328623] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 738.328848] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "refresh_cache-197488cc-ac6b-4561-8d57-f372c6493573" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 738.328990] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 
tempest-AttachInterfacesTestJSON-287887941-project-member] Acquired lock "refresh_cache-197488cc-ac6b-4561-8d57-f372c6493573" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.329160] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 738.437019] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.907770] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.923755] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Releasing lock "refresh_cache-197488cc-ac6b-4561-8d57-f372c6493573" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 738.923996] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Virt driver does not provide unplug_vifs method, so it is 
not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 738.924193] env[59857]: DEBUG nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 738.924361] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 738.986409] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.995445] env[59857]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.015084] env[59857]: INFO nova.compute.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Took 0.09 seconds to deallocate network for instance. 
[ 739.134940] env[59857]: INFO nova.scheduler.client.report [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Deleted allocations for instance 197488cc-ac6b-4561-8d57-f372c6493573 [ 739.160423] env[59857]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "197488cc-ac6b-4561-8d57-f372c6493573" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.089s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.225959] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "ac74db4e-ee8d-4aab-96bc-b41bc30d371b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.226206] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "ac74db4e-ee8d-4aab-96bc-b41bc30d371b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.253684] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 739.310197] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.310197] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.311886] env[59857]: INFO nova.compute.claims [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 739.469314] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-013426fa-06c5-4795-acc7-c791e03321c8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.477156] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03b1928d-9c8b-4348-90d5-98bb455a1d69 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.513328] env[59857]: ERROR nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Instance failed network 
setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. [ 739.513328] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 739.513328] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 739.513328] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 739.513328] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 739.513328] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 739.513328] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 739.513328] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 739.513328] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 739.513328] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 739.513328] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 739.513328] env[59857]: ERROR nova.compute.manager raise self.value [ 739.513328] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 739.513328] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 739.513328] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 739.513328] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 739.513940] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 739.513940] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 739.513940] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. [ 739.513940] env[59857]: ERROR nova.compute.manager [ 739.513940] env[59857]: Traceback (most recent call last): [ 739.513940] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 739.513940] env[59857]: listener.cb(fileno) [ 739.513940] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 739.513940] env[59857]: result = function(*args, **kwargs) [ 739.513940] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 739.513940] env[59857]: return func(*args, **kwargs) [ 739.513940] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 739.513940] env[59857]: raise e [ 739.513940] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 739.513940] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 739.513940] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 739.513940] env[59857]: created_port_ids = self._update_ports_for_instance( [ 739.513940] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 739.513940] env[59857]: with excutils.save_and_reraise_exception(): [ 739.513940] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 739.513940] env[59857]: self.force_reraise() [ 739.513940] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 739.513940] env[59857]: raise self.value [ 739.513940] env[59857]: 
File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 739.513940] env[59857]: updated_port = self._update_port( [ 739.513940] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 739.513940] env[59857]: _ensure_no_port_binding_failure(port) [ 739.513940] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 739.513940] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 739.514872] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. [ 739.514872] env[59857]: Removing descriptor: 17 [ 739.515487] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-050c28f7-ed20-4af3-ace5-8a1b27f115ab {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.518806] env[59857]: ERROR nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. 
[ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Traceback (most recent call last): [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] yield resources [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self.driver.spawn(context, instance, image_meta, [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] vm_ref = self.build_virtual_machine(instance, [ 739.518806] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] vif_infos = vmwarevif.get_vif_info(self._session, [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 739.519227] env[59857]: ERROR 
nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] for vif in network_info: [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return self._sync_wrapper(fn, *args, **kwargs) [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self.wait() [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self[:] = self._gt.wait() [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return self._exit_event.wait() [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] result = hub.switch() [ 739.519227] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return self.greenlet.switch() [ 739.519602] env[59857]: ERROR 
nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] result = function(*args, **kwargs) [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return func(*args, **kwargs) [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] raise e [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] nwinfo = self.network_api.allocate_for_instance( [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] created_port_ids = self._update_ports_for_instance( [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 739.519602] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] with excutils.save_and_reraise_exception(): [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 
949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self.force_reraise() [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] raise self.value [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] updated_port = self._update_port( [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] _ensure_no_port_binding_failure(port) [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] raise exception.PortBindingFailed(port_id=port['id']) [ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. 
[ 739.519972] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] [ 739.520315] env[59857]: INFO nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Terminating instance [ 739.521148] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 739.521302] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquired lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 739.521454] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 739.525632] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b314ff6c-d2ad-49f9-8e62-a9dbfd03eca9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.543515] env[59857]: DEBUG nova.compute.provider_tree [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Inventory has not changed in ProviderTree for provider: 
80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.553846] env[59857]: DEBUG nova.scheduler.client.report [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.568272] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.569011] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 739.600414] env[59857]: DEBUG nova.compute.utils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 739.601943] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 739.602155] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 739.605280] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.610770] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 739.687141] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 739.709141] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 739.709473] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 739.709535] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 
tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 739.709686] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 739.709827] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 739.709968] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 739.710280] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 739.710448] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 739.710614] env[59857]: 
DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 739.710773] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 739.710939] env[59857]: DEBUG nova.virt.hardware [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 739.712073] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-626620a8-4daf-4383-83d1-25d03a34b79a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.721023] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b6c6e2-deae-4996-a7fc-79a8264e9ab8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.767990] env[59857]: DEBUG nova.policy [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd07a8b6181ab41348233feb85133e0a4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'51cca17e80d947b495de9f644e67bb98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 740.298923] env[59857]: DEBUG nova.compute.manager [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Received event network-changed-33db9413-9863-4b89-8ca0-4838adac1c47 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 740.299140] env[59857]: DEBUG nova.compute.manager [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Refreshing instance network info cache due to event network-changed-33db9413-9863-4b89-8ca0-4838adac1c47. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 740.299326] env[59857]: DEBUG oslo_concurrency.lockutils [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] Acquiring lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 740.333548] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.345296] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Releasing lock 
"refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 740.345819] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 740.345819] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 740.346077] env[59857]: DEBUG oslo_concurrency.lockutils [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] Acquired lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 740.346240] env[59857]: DEBUG nova.network.neutron [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Refreshing network info cache for port 33db9413-9863-4b89-8ca0-4838adac1c47 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 740.351036] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8ec0b0ec-6656-419f-9b01-3e98d03a3e05 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.361849] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e698ebbd-c4f0-4768-ac0a-bebcea061526 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.392174] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 949f98a2-9316-4cbd-b1e3-b05d08a68997 could not be found. [ 740.392174] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 740.392481] env[59857]: INFO nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Took 0.05 seconds to destroy the instance on the hypervisor. [ 740.392923] env[59857]: DEBUG oslo.service.loopingcall [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 740.393352] env[59857]: DEBUG nova.compute.manager [-] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 740.393559] env[59857]: DEBUG nova.network.neutron [-] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 740.441260] env[59857]: DEBUG nova.network.neutron [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.521320] env[59857]: DEBUG nova.network.neutron [-] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.530288] env[59857]: DEBUG nova.network.neutron [-] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.539997] env[59857]: INFO nova.compute.manager [-] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Took 0.15 seconds to deallocate network for instance. 
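The `PortBindingFailed` tracebacks above all funnel through `_ensure_no_port_binding_failure` in `nova/network/neutron.py`: Neutron reports a port whose binding could not be completed by setting its `binding:vif_type` to `binding_failed`, and Nova converts that into the exception seen in the log. A minimal sketch of that check (not the exact Nova source; the class and helper below are simplified stand-ins):

```python
class PortBindingFailed(Exception):
    """Simplified stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check neutron "
            "logs for more information.")
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    # Neutron marks a failed binding with binding:vif_type ==
    # 'binding_failed'; Nova raises on it so the build is aborted and the
    # claim rolled back, as in the log above. Sketch only.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])
```

When the exception propagates, the compute manager terminates the instance, deallocates its (empty) network info, and aborts the resource claim — exactly the sequence logged for instance 949f98a2-9316-4cbd-b1e3-b05d08a68997.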
[ 740.543812] env[59857]: DEBUG nova.compute.claims [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 740.543812] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.543812] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.677748] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ce2088f-7e36-45b8-b70f-eb5f1452b906 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.685978] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c75ec97e-f684-451e-9d26-26a2b6e12010 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.720078] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf488ce-258b-4347-be6c-18c1c1b63435 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.730523] env[59857]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-172f3256-95fb-4dcb-b2d2-321b7850a0a5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.746376] env[59857]: DEBUG nova.compute.provider_tree [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.761821] env[59857]: DEBUG nova.scheduler.client.report [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.777992] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.235s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.778653] env[59857]: ERROR nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] 
Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Traceback (most recent call last): [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self.driver.spawn(context, instance, image_meta, [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self._vmops.spawn(context, instance, image_meta, injected_files, [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] vm_ref = self.build_virtual_machine(instance, [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] vif_infos = vmwarevif.get_vif_info(self._session, [ 740.778653] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] for vif in network_info: [ 740.778990] env[59857]: ERROR 
nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return self._sync_wrapper(fn, *args, **kwargs) [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self.wait() [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self[:] = self._gt.wait() [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return self._exit_event.wait() [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] result = hub.switch() [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return self.greenlet.switch() [ 740.778990] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] result = function(*args, **kwargs) [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] return func(*args, **kwargs) [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] raise e [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] nwinfo = self.network_api.allocate_for_instance( [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] created_port_ids = self._update_ports_for_instance( [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] with excutils.save_and_reraise_exception(): [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.779366] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] self.force_reraise() [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] raise self.value [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] updated_port = self._update_port( [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] _ensure_no_port_binding_failure(port) [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] raise exception.PortBindingFailed(port_id=port['id']) [ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. 
[ 740.779660] env[59857]: ERROR nova.compute.manager [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] [ 740.779660] env[59857]: DEBUG nova.compute.utils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 740.784521] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Build of instance 949f98a2-9316-4cbd-b1e3-b05d08a68997 was re-scheduled: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 740.784521] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 740.784521] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 740.889919] env[59857]: DEBUG nova.network.neutron [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Updating 
instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.909837] env[59857]: DEBUG oslo_concurrency.lockutils [req-db0c6873-dc7e-4889-87a1-9a3bdf6d3ecf req-29c3b2b1-8495-41d3-a2c6-afc2c6c4f44e service nova] Releasing lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 740.910208] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquired lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 740.910389] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 740.967122] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.098967] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "963513ec-2280-475b-87a0-045df892e8b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.099221] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "963513ec-2280-475b-87a0-045df892e8b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.114023] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 741.172460] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.172735] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.174190] env[59857]: INFO nova.compute.claims [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 741.350904] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d19399d4-9e82-4547-ae0e-5d761e511ce5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.358821] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f345e02c-c220-4ed8-85d2-2010bc888f14 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.407027] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d339d3ed-4824-46e9-b3ba-e1d67417f28b {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.415752] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aae63000-a792-444a-aca8-ca7e816b1ebc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.434563] env[59857]: DEBUG nova.compute.provider_tree [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.452025] env[59857]: DEBUG nova.scheduler.client.report [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.470600] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.298s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.472426] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 
tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 741.523661] env[59857]: DEBUG nova.compute.utils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 741.526013] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 741.526013] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 741.538577] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 741.618243] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 741.646377] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=<?>,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-10-23T14:42:58Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 741.646738] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 741.647479] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 741.647592] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 741.652818] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 741.652818] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 741.652818] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 741.652818] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 741.652818] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 741.653029] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 
tempest-ServerPasswordTestJSON-1808313117-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 741.653029] env[59857]: DEBUG nova.virt.hardware [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 741.654837] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaec2c74-4c4e-4740-badc-616de5162970 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.664951] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc2672b-ead6-4ae3-a451-786e12144cdf {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.685381] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.697428] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Releasing lock "refresh_cache-949f98a2-9316-4cbd-b1e3-b05d08a68997" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.697428] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 
tempest-ServerRescueTestJSON-1441311787-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 741.697593] env[59857]: DEBUG nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 741.697803] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.829137] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.839995] env[59857]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.851455] env[59857]: INFO nova.compute.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Took 0.15 seconds to deallocate network for instance. [ 741.879203] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Successfully created port: 02752945-ef11-45a3-8c97-693f994af658 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 741.949093] env[59857]: DEBUG nova.policy [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd08df6035734ed594e1a61adc83f5a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6dda5a654e88441fa1c1f01d1435fda8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.960556] env[59857]: INFO nova.scheduler.client.report [None 
req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Deleted allocations for instance 949f98a2-9316-4cbd-b1e3-b05d08a68997 [ 741.984473] env[59857]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "949f98a2-9316-4cbd-b1e3-b05d08a68997" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.727s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.369137] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "f12524a7-21b9-4e35-b15b-955627d58c7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.369137] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "f12524a7-21b9-4e35-b15b-955627d58c7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.391117] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.391117] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.392194] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 743.407255] env[59857]: DEBUG nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 743.462321] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.466026] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.470117] env[59857]: INFO nova.compute.claims [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.476093] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.666848] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78551e9f-8ee2-4040-b87b-aac0d02cce15 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.674754] env[59857]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d97b8178-d127-4bc8-9601-c454d1d3b11f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.704661] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3db862ae-9265-4063-ace4-70e66775d1a4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.712349] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f7778d7-b200-4bfb-8049-bc5fde176782 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.726915] env[59857]: DEBUG nova.compute.provider_tree [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.735648] env[59857]: DEBUG nova.scheduler.client.report [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.754875] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 
tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.755431] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 743.758691] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.283s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.760445] env[59857]: INFO nova.compute.claims [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.811139] env[59857]: DEBUG nova.compute.utils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 743.812488] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] 
Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 743.815211] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 743.826352] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 743.923670] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 743.955162] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 743.955394] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 743.955574] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 743.956025] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Flavor pref 
0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 743.956025] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 743.956025] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 743.956159] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 743.956310] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 743.956467] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 743.956660] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 
tempest-ServerDiskConfigTestJSON-927379070-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 743.956839] env[59857]: DEBUG nova.virt.hardware [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 743.957707] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef95bdc3-4a2e-4966-a767-95897240020e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.966320] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f147525-c6ec-4f14-b5d6-1c2dbfcc9983 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.970352] env[59857]: DEBUG nova.policy [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'baef766334764dd9ab481d3a2aacd07b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9e33b2e4b8c439a8e8a557ddda22fce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 743.975668] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-25815b07-433f-4119-bb28-2017232be4e9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.982494] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.984268] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e790b9b0-2117-45fc-a718-133d3ea154fc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.024832] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7daca120-44d7-43f5-807c-87e81f3caa75 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.032390] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f8230cc-b8ed-4d3e-bdfd-f367f08c7121 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.047939] env[59857]: DEBUG nova.compute.provider_tree [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.058393] env[59857]: DEBUG nova.scheduler.client.report [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for 
provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.075332] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.075826] env[59857]: DEBUG nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 744.121267] env[59857]: DEBUG nova.compute.claims [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 744.121460] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.121739] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.314542] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Successfully created port: 64249b93-1271-4bd9-b7ca-deae2d68e0fd {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 744.318479] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e19c15e2-1501-4a74-9b85-0820ae934aef {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.328735] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c450d61e-ff0f-466c-8280-a77ef5332cb5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.361500] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a313216e-f7ab-4c3a-9e00-8f4a4328c183 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.369526] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2468bb47-e30c-4c8a-8035-6b1172dcf68c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.384475] env[59857]: DEBUG nova.compute.provider_tree [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.394572] env[59857]: DEBUG nova.scheduler.client.report [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.408876] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 
tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.409660] env[59857]: DEBUG nova.compute.utils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Conflict updating instance bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 744.411116] env[59857]: DEBUG nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance disappeared during build. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 744.411348] env[59857]: DEBUG nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 744.411617] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 744.411814] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.412096] env[59857]: DEBUG nova.network.neutron [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 744.482355] env[59857]: DEBUG nova.network.neutron [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.208371] env[59857]: DEBUG nova.network.neutron [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.220625] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.220625] env[59857]: DEBUG nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 745.220625] env[59857]: DEBUG nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 745.220625] env[59857]: DEBUG nova.network.neutron [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 745.289112] env[59857]: DEBUG nova.network.neutron [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.302848] env[59857]: DEBUG nova.network.neutron [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.317369] env[59857]: INFO nova.compute.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Took 0.10 seconds to deallocate network for instance. 
[ 745.418164] env[59857]: INFO nova.scheduler.client.report [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Deleted allocations for instance bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d
[ 745.419043] env[59857]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 2.029s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 745.419043] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 1.436s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 745.419043] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 745.419259] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 745.419259] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 745.421835] env[59857]: INFO nova.compute.manager [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Terminating instance
[ 745.425650] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 745.425801] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 745.425962] env[59857]: DEBUG nova.network.neutron [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 745.536256] env[59857]: DEBUG nova.network.neutron [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 746.133405] env[59857]: DEBUG nova.network.neutron [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 746.169599] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 746.170683] env[59857]: DEBUG nova.compute.manager [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 746.170683] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 746.171121] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-82532422-6fb1-46e6-93d5-f308343962c5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 746.180908] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c10c288-7351-4f0d-b9b7-33e57f797a9b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 746.211665] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d could not be found.
[ 746.211959] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 746.212186] env[59857]: INFO nova.compute.manager [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 746.212424] env[59857]: DEBUG oslo.service.loopingcall [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 746.212950] env[59857]: DEBUG nova.compute.manager [-] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 746.213219] env[59857]: DEBUG nova.network.neutron [-] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 746.268119] env[59857]: DEBUG nova.network.neutron [-] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 746.283589] env[59857]: DEBUG nova.network.neutron [-] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.297041] env[59857]: INFO nova.compute.manager [-] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Took 0.08 seconds to deallocate network for instance. 
[ 746.457192] env[59857]: DEBUG oslo_concurrency.lockutils [None req-2ce8ad72-7513-4a44-b900-0fb4886cac5b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.038s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.493197] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Successfully created port: 67a8229f-3cad-45fd-8c40-d2c3e22b636a {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 746.840037] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 748.335014] env[59857]: DEBUG nova.compute.manager [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Received event network-changed-02752945-ef11-45a3-8c97-693f994af658 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 748.335014] env[59857]: DEBUG nova.compute.manager [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Refreshing instance network info cache due to event network-changed-02752945-ef11-45a3-8c97-693f994af658. 
{{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 748.335300] env[59857]: DEBUG oslo_concurrency.lockutils [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] Acquiring lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 748.335300] env[59857]: DEBUG oslo_concurrency.lockutils [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] Acquired lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 748.335372] env[59857]: DEBUG nova.network.neutron [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Refreshing network info cache for port 02752945-ef11-45a3-8c97-693f994af658 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 748.618611] env[59857]: DEBUG nova.network.neutron [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.840465] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 748.840632] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Starting heal instance info cache {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 748.840869] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Rebuilding the list of instances to heal {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 748.863297] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 748.863297] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 748.863297] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Skipping network cache update for instance because it is Building. 
{{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 748.863479] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 748.863479] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 748.863609] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 748.863714] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Didn't find any instances for network info cache update. 
{{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 748.864653] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 748.864853] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 748.864982] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59857) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 749.033189] env[59857]: DEBUG nova.network.neutron [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.044692] env[59857]: DEBUG oslo_concurrency.lockutils [req-f6b0a07e-1d42-4af7-9f7c-85401c561d0f req-2b040c77-f48e-4dd3-a023-f99d63adffd7 service nova] Releasing lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 749.122039] env[59857]: ERROR nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. 
[ 749.122039] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 749.122039] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 749.122039] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 749.122039] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 749.122039] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 749.122039] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 749.122039] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 749.122039] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 749.122039] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 749.122039] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 749.122039] env[59857]: ERROR nova.compute.manager raise self.value [ 749.122039] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 749.122039] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 749.122039] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 749.122039] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 749.122716] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 749.122716] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 749.122716] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. [ 749.122716] env[59857]: ERROR nova.compute.manager [ 749.122716] env[59857]: Traceback (most recent call last): [ 749.122716] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 749.122716] env[59857]: listener.cb(fileno) [ 749.122716] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 749.122716] env[59857]: result = function(*args, **kwargs) [ 749.122716] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 749.122716] env[59857]: return func(*args, **kwargs) [ 749.122716] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 749.122716] env[59857]: raise e [ 749.122716] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 749.122716] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 749.122716] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 749.122716] env[59857]: created_port_ids = self._update_ports_for_instance( [ 749.122716] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 749.122716] env[59857]: with excutils.save_and_reraise_exception(): [ 749.122716] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 749.122716] env[59857]: self.force_reraise() [ 749.122716] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 749.122716] env[59857]: raise self.value [ 749.122716] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 749.122716] env[59857]: updated_port = self._update_port( [ 749.122716] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 749.122716] env[59857]: _ensure_no_port_binding_failure(port) [ 749.122716] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 749.122716] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 749.125361] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. [ 749.125361] env[59857]: Removing descriptor: 15 [ 749.125361] env[59857]: ERROR nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Traceback (most recent call last): [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] yield resources [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self.driver.spawn(context, instance, image_meta, [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 749.125361] env[59857]: ERROR nova.compute.manager 
[instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 749.125361] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] vm_ref = self.build_virtual_machine(instance, [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] vif_infos = vmwarevif.get_vif_info(self._session, [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] for vif in network_info: [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return self._sync_wrapper(fn, *args, **kwargs) [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self.wait() [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self[:] = self._gt.wait() [ 749.125779] 
env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return self._exit_event.wait() [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 749.125779] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] result = hub.switch() [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return self.greenlet.switch() [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] result = function(*args, **kwargs) [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return func(*args, **kwargs) [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] raise e [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] nwinfo = self.network_api.allocate_for_instance( [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] created_port_ids = self._update_ports_for_instance( [ 749.126220] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] with excutils.save_and_reraise_exception(): [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self.force_reraise() [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] raise self.value [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] updated_port = self._update_port( [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] _ensure_no_port_binding_failure(port) [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] raise exception.PortBindingFailed(port_id=port['id']) [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. [ 749.126538] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] [ 749.128413] env[59857]: INFO nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Terminating instance [ 749.128413] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 749.128413] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquired lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 749.128413] env[59857]: DEBUG nova.network.neutron [None 
req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 749.192940] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.734035] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "507d89fc-2083-4575-9a9c-f7f350741ef3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.734303] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "507d89fc-2083-4575-9a9c-f7f350741ef3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.747499] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 749.783308] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.804211] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Releasing lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 749.804211] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 749.804211] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 749.804211] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4016cd20-ca57-4865-9203-3d4253245f75 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.814442] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa89218-bab8-45fb-8429-a0d663075bf3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.851759] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.853637] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b could not be found. 
[ 749.853637] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 749.853637] env[59857]: INFO nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 749.853637] env[59857]: DEBUG oslo.service.loopingcall [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 749.854738] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.854987] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.856588] env[59857]: INFO nova.compute.claims [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 749.859072] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.859317] env[59857]: DEBUG nova.compute.manager [-] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 749.859442] env[59857]: DEBUG nova.network.neutron [-] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 749.861506] env[59857]: DEBUG 
oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.861879] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.862400] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager.update_available_resource {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.872268] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.930755] env[59857]: DEBUG nova.network.neutron [-] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.941186] env[59857]: DEBUG nova.network.neutron [-] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.951678] env[59857]: INFO nova.compute.manager [-] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Took 0.09 seconds to deallocate network for instance. 
[ 749.953647] env[59857]: DEBUG nova.compute.claims [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 749.953770] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.061019] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd818d59-4608-4714-9bf4-5a56220dd4fb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.069247] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0e605d6-3e20-46e8-b5d3-ccc5cea0b697 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.103956] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6f52c5c-257d-4a09-88d6-d26aa528db36 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.113000] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87e131a8-a1a9-41af-aeec-abc12a1a7d2d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.129573] env[59857]: DEBUG nova.compute.provider_tree [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 
tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 750.141632] env[59857]: DEBUG nova.scheduler.client.report [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 750.161780] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.162624] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Start building networks asynchronously for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 750.165654] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.294s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.165799] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.165963] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59857) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 750.166259] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.212s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.173091] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c3173c-cb8e-4741-aaf0-1af13703a4d7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.181255] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b505734e-be4a-48e4-9751-b6bbfbde3d85 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.198141] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-768e0f7c-2000-40a1-90a4-ab121a877a8f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.205913] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb070b03-6569-4ec6-9879-9e9305401d16 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.211357] env[59857]: DEBUG nova.compute.utils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 750.213279] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 750.213899] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 750.246068] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181510MB free_disk=154GB free_vcpus=48 pci_devices=None {{(pid=59857) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 750.246262] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.247368] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 750.346851] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 750.374771] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 750.375019] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 750.375174] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 750.375352] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Flavor pref 0:0:0 
{{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 750.375491] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 750.375633] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 750.375837] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 750.375990] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 750.376195] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 750.376327] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 750.376494] env[59857]: DEBUG nova.virt.hardware [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 750.377465] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a6a83e8-39c3-472a-8061-2bd2b3305210 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.391865] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2732bd91-4e5e-4431-9aba-9b07538e6a89 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.396036] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-414dd5bc-55fd-4782-acd9-24e315b44306 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.411178] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9639b913-f613-4853-9305-f76d3714cb13 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.450415] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42adf49f-9262-45bf-b0a5-1da6a0dd1dd6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.459447] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb1832d7-d2fb-46a9-8b53-412cf9bef998 {{(pid=59857) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.479379] env[59857]: DEBUG nova.compute.provider_tree [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 750.489614] env[59857]: DEBUG nova.scheduler.client.report [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 750.505872] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.339s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.506342] env[59857]: ERROR nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Failed to build and run instance: nova.exception.PortBindingFailed: 
Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Traceback (most recent call last): [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self.driver.spawn(context, instance, image_meta, [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] vm_ref = self.build_virtual_machine(instance, [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] vif_infos = vmwarevif.get_vif_info(self._session, [ 750.506342] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] for vif in network_info: [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] 
File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return self._sync_wrapper(fn, *args, **kwargs) [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self.wait() [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self[:] = self._gt.wait() [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return self._exit_event.wait() [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] result = hub.switch() [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return self.greenlet.switch() [ 750.506815] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 750.507330] env[59857]: ERROR 
nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] result = function(*args, **kwargs) [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] return func(*args, **kwargs) [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] raise e [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] nwinfo = self.network_api.allocate_for_instance( [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] created_port_ids = self._update_ports_for_instance( [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] with excutils.save_and_reraise_exception(): [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.507330] env[59857]: ERROR nova.compute.manager [instance: 
ac74db4e-ee8d-4aab-96bc-b41bc30d371b] self.force_reraise() [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] raise self.value [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] updated_port = self._update_port( [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] _ensure_no_port_binding_failure(port) [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] raise exception.PortBindingFailed(port_id=port['id']) [ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. 
[ 750.507870] env[59857]: ERROR nova.compute.manager [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] [ 750.508252] env[59857]: DEBUG nova.compute.utils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 750.509261] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.263s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.511102] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Build of instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b was re-scheduled: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 750.511560] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 750.511883] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.511971] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquired lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.512074] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 750.569890] env[59857]: DEBUG nova.policy [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ae0c3fdf5814c20819e4329e87733e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 
'project_id': 'd742fb05f93f44a9b9c8207f47e77730', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 750.598162] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.598328] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 11f468ba-a807-4490-9dd5-58eaad007865 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.598451] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 26aa196e-e745-494d-814f-7da3cf18ec14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.604096] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.628600] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.628771] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 963513ec-2280-475b-87a0-045df892e8b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.630130] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance f12524a7-21b9-4e35-b15b-955627d58c7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.630130] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 507d89fc-2083-4575-9a9c-f7f350741ef3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.630130] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 750.630130] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 750.700285] env[59857]: DEBUG nova.compute.manager [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Received event network-changed-64249b93-1271-4bd9-b7ca-deae2d68e0fd {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 750.700503] env[59857]: DEBUG nova.compute.manager [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Refreshing instance network info cache due to event network-changed-64249b93-1271-4bd9-b7ca-deae2d68e0fd. 
{{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 750.700682] env[59857]: DEBUG oslo_concurrency.lockutils [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] Acquiring lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.700814] env[59857]: DEBUG oslo_concurrency.lockutils [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] Acquired lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.700960] env[59857]: DEBUG nova.network.neutron [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Refreshing network info cache for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 750.771474] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e9f0b36-d6e1-4f52-9589-5e7420b63593 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.779697] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1d3ef7-eea1-48ec-a724-678b4e0b9185 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.810754] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c082077c-567f-4fd2-84f8-961480be2051 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.819801] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-99005322-e28c-4b46-8fb8-896c06131cca {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.834464] env[59857]: DEBUG nova.compute.provider_tree [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 750.844963] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 750.863718] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59857) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 750.863907] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.975308] env[59857]: DEBUG nova.network.neutron [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 751.196422] env[59857]: ERROR nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. [ 751.196422] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 751.196422] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 751.196422] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 751.196422] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 751.196422] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 751.196422] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 751.196422] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 751.196422] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 751.196422] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 751.196422] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 751.196422] env[59857]: ERROR nova.compute.manager raise self.value [ 751.196422] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 751.196422] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 751.196422] env[59857]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 751.196422] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 751.196951] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 751.196951] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 751.196951] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. [ 751.196951] env[59857]: ERROR nova.compute.manager [ 751.196951] env[59857]: Traceback (most recent call last): [ 751.196951] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 751.196951] env[59857]: listener.cb(fileno) [ 751.196951] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 751.196951] env[59857]: result = function(*args, **kwargs) [ 751.196951] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 751.196951] env[59857]: return func(*args, **kwargs) [ 751.196951] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 751.196951] env[59857]: raise e [ 751.196951] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 751.196951] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 751.196951] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 751.196951] env[59857]: created_port_ids = self._update_ports_for_instance( [ 751.196951] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 751.196951] env[59857]: with excutils.save_and_reraise_exception(): [ 751.196951] env[59857]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 751.196951] env[59857]: self.force_reraise() [ 751.196951] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 751.196951] env[59857]: raise self.value [ 751.196951] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 751.196951] env[59857]: updated_port = self._update_port( [ 751.196951] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 751.196951] env[59857]: _ensure_no_port_binding_failure(port) [ 751.196951] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 751.196951] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 751.197753] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. [ 751.197753] env[59857]: Removing descriptor: 12 [ 751.197753] env[59857]: ERROR nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. 
[ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] Traceback (most recent call last): [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] yield resources [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self.driver.spawn(context, instance, image_meta, [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 751.197753] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] vm_ref = self.build_virtual_machine(instance, [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] vif_infos = vmwarevif.get_vif_info(self._session, [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 751.198193] env[59857]: ERROR 
nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] for vif in network_info: [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return self._sync_wrapper(fn, *args, **kwargs) [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self.wait() [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self[:] = self._gt.wait() [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return self._exit_event.wait() [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 751.198193] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] result = hub.switch() [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return self.greenlet.switch() [ 751.198587] env[59857]: ERROR 
nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] result = function(*args, **kwargs) [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return func(*args, **kwargs) [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] raise e [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] nwinfo = self.network_api.allocate_for_instance( [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] created_port_ids = self._update_ports_for_instance( [ 751.198587] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] with excutils.save_and_reraise_exception(): [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 
963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self.force_reraise() [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] raise self.value [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] updated_port = self._update_port( [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] _ensure_no_port_binding_failure(port) [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] raise exception.PortBindingFailed(port_id=port['id']) [ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. 
[ 751.198966] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] [ 751.199343] env[59857]: INFO nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Terminating instance [ 751.203988] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 751.426977] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.438359] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.438828] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s 
{{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.440840] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Releasing lock "refresh_cache-ac74db4e-ee8d-4aab-96bc-b41bc30d371b" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.441038] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 751.441213] env[59857]: DEBUG nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 751.441369] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 751.447684] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 751.499370] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.499758] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.503214] env[59857]: INFO nova.compute.claims [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 751.512870] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 751.525376] env[59857]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.550811] env[59857]: INFO nova.compute.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Took 0.11 seconds to deallocate network for instance. [ 751.553843] env[59857]: DEBUG nova.network.neutron [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.576052] env[59857]: DEBUG oslo_concurrency.lockutils [req-be815dde-9355-4d1a-8a14-080cdbf08c80 req-f5a93695-2249-42f7-84e8-91be7652a040 service nova] Releasing lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.576390] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquired lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 751.576607] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 
tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 751.674998] env[59857]: INFO nova.scheduler.client.report [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Deleted allocations for instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b [ 751.687971] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 751.698329] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "ac74db4e-ee8d-4aab-96bc-b41bc30d371b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.472s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.731022] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57ac4c59-5d17-4ad2-afd4-dee155cb39f2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.739235] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e204be24-ef02-411e-953e-ebe6ccde799b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.777532] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-be83aa8a-6df5-4c4a-8408-e4d2409a9ac5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.786624] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1082bc6d-4704-471f-8150-81f9bd40573f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.801889] env[59857]: DEBUG nova.compute.provider_tree [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 751.811844] env[59857]: DEBUG nova.scheduler.client.report [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 751.832159] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.832159] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 751.869619] env[59857]: DEBUG nova.compute.utils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 751.871795] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 751.875025] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 751.885326] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 751.968771] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 751.992852] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 751.993115] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 751.993286] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 
tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 751.993465] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 751.993606] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 751.993748] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 751.993947] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 751.994364] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 751.994585] 
env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 751.994978] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 751.995190] env[59857]: DEBUG nova.virt.hardware [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 751.996053] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd5d159d-3cc3-43db-8a73-dd035d83e4e6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.005416] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72bd9c3f-c0a2-4bad-8153-f944a5eb94b8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.251599] env[59857]: DEBUG nova.policy [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4c92e5f993844571be7b606f48976f9b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'c884340032e64abcbc9e405b7da4cb6f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 752.297548] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.308019] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Releasing lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 752.308019] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 752.308019] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 752.308019] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-39445765-172e-4b5c-b150-04f3803add89 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.322015] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e892631-be7c-444b-ad4e-234a1c71328b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.345652] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 963513ec-2280-475b-87a0-045df892e8b4 could not be found. [ 752.346050] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 752.346984] env[59857]: INFO nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 752.347679] env[59857]: DEBUG oslo.service.loopingcall [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 752.348288] env[59857]: DEBUG nova.compute.manager [-] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 752.348484] env[59857]: DEBUG nova.network.neutron [-] [instance: 963513ec-2280-475b-87a0-045df892e8b4] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 752.402253] env[59857]: DEBUG nova.network.neutron [-] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 752.409305] env[59857]: DEBUG nova.network.neutron [-] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.425020] env[59857]: INFO nova.compute.manager [-] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Took 0.07 seconds to deallocate network for instance. 
[ 752.425020] env[59857]: DEBUG nova.compute.claims [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 752.425020] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.425020] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.497282] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Successfully created port: 3c3207c4-43fc-434a-b522-b2b074acdf74 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 752.601417] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4018f5bc-9f83-4436-8b1e-ba19c9585381 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.610100] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c79389a9-df3b-4c00-8f5d-c337a1b4d224 {{(pid=59857) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.649521] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0e88ff2-aa86-4548-97d2-9a0efa4d7642 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.658256] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "3835af93-7a47-4d3c-9296-256aadddc3b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.659243] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "3835af93-7a47-4d3c-9296-256aadddc3b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.664817] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8adeba89-0bd7-4a09-ac53-65a63e75d25f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.670860] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 752.683409] env[59857]: DEBUG nova.compute.provider_tree [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.693130] env[59857]: DEBUG nova.scheduler.client.report [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.710631] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.711550] env[59857]: ERROR nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 
64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] Traceback (most recent call last): [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self.driver.spawn(context, instance, image_meta, [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] vm_ref = self.build_virtual_machine(instance, [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] vif_infos = vmwarevif.get_vif_info(self._session, [ 752.711550] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] for vif in network_info: [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File 
"/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return self._sync_wrapper(fn, *args, **kwargs) [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self.wait() [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] self[:] = self._gt.wait() [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return self._exit_event.wait() [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] result = hub.switch() [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return self.greenlet.switch() [ 752.712081] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.712578] env[59857]: ERROR nova.compute.manager 
[instance: 963513ec-2280-475b-87a0-045df892e8b4] result = function(*args, **kwargs) [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] return func(*args, **kwargs) [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] raise e [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] nwinfo = self.network_api.allocate_for_instance( [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] created_port_ids = self._update_ports_for_instance( [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] with excutils.save_and_reraise_exception(): [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.712578] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] 
self.force_reraise() [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] raise self.value [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] updated_port = self._update_port( [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] _ensure_no_port_binding_failure(port) [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] raise exception.PortBindingFailed(port_id=port['id']) [ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. 
[ 752.713031] env[59857]: ERROR nova.compute.manager [instance: 963513ec-2280-475b-87a0-045df892e8b4] [ 752.713031] env[59857]: DEBUG nova.compute.utils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 752.714637] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Build of instance 963513ec-2280-475b-87a0-045df892e8b4 was re-scheduled: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 752.715066] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 752.715298] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.715440] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 
tempest-ServerPasswordTestJSON-1808313117-project-member] Acquired lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 752.715601] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 752.735402] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.735646] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.737792] env[59857]: INFO nova.compute.claims [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 752.830930] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance cache missing 
network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 752.956114] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2fea64b-130c-476f-a7fd-9748934d69c8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.966359] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5ad52ca-258a-4c0a-8b26-ac8a31229440 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.000015] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd7d085e-8f5b-4fab-be76-de90ef09fe49 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.009804] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01435573-aab2-4bc5-bab2-a7df27d080df {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.027233] env[59857]: DEBUG nova.compute.provider_tree [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 753.038598] env[59857]: DEBUG nova.scheduler.client.report [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.058019] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.058019] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 753.100366] env[59857]: DEBUG nova.compute.utils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 753.101602] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Allocating IP information in the background. 
{{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 753.101914] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 753.116344] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 753.200788] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 753.231738] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 753.231987] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 753.232145] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 753.232327] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 753.232465] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 753.232607] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 753.232807] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 753.232953] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 753.233995] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 753.234291] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 753.234560] env[59857]: DEBUG nova.virt.hardware [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 753.237068] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba04b8c-d765-4ae8-bb3e-19c52f12a305 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 753.245823] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6749842-5ed3-4b0f-b9e4-5696d531c48e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 753.346881] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "67b672b6-c6cb-4dc2-9d75-fb028195a0dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 753.347153] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "67b672b6-c6cb-4dc2-9d75-fb028195a0dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 753.361102] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 753.422158] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 753.422659] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 753.425893] env[59857]: INFO nova.compute.claims [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 753.493071] env[59857]: DEBUG nova.policy [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '18ef365c1e9342b7b0354ea96850399f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '27d36caa344f42aca3919c92d468bbd6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}}
[ 753.601831] env[59857]: DEBUG nova.compute.manager [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Received event network-changed-67a8229f-3cad-45fd-8c40-d2c3e22b636a {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 753.601996] env[59857]: DEBUG nova.compute.manager [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Refreshing instance network info cache due to event network-changed-67a8229f-3cad-45fd-8c40-d2c3e22b636a. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 753.602220] env[59857]: DEBUG oslo_concurrency.lockutils [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] Acquiring lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 753.602349] env[59857]: DEBUG oslo_concurrency.lockutils [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] Acquired lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 753.602497] env[59857]: DEBUG nova.network.neutron [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Refreshing network info cache for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 753.664431] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 753.670720] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd10e761-5d60-4a5a-9f42-fee055511099 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 753.677278] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Releasing lock "refresh_cache-963513ec-2280-475b-87a0-045df892e8b4" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 753.677538] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 753.677690] env[59857]: DEBUG nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 753.677867] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 753.683592] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cf40016-b9bc-4f0f-9c22-8b552f6b0cd8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 753.722654] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64c8a69d-27f2-4287-a557-9d575dc6f003 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 753.732457] env[59857]: DEBUG nova.network.neutron [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 753.735621] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390aa542-a06d-48bb-b1cd-88b26a79b57c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 753.752898] env[59857]: DEBUG nova.compute.provider_tree [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 753.755036] env[59857]: ERROR nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.
[ 753.755036] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 753.755036] env[59857]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 753.755036] env[59857]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 753.755036] env[59857]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 753.755036] env[59857]: ERROR nova.compute.manager     self.force_reraise()
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 753.755036] env[59857]: ERROR nova.compute.manager     raise self.value
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 753.755036] env[59857]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 753.755036] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 753.755036] env[59857]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 753.755669] env[59857]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 753.755669] env[59857]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 753.755669] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.
[ 753.755669] env[59857]: ERROR nova.compute.manager
[ 753.755669] env[59857]: Traceback (most recent call last):
[ 753.755669] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 753.755669] env[59857]:     listener.cb(fileno)
[ 753.755669] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 753.755669] env[59857]:     result = function(*args, **kwargs)
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 753.755669] env[59857]:     return func(*args, **kwargs)
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 753.755669] env[59857]:     raise e
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 753.755669] env[59857]:     nwinfo = self.network_api.allocate_for_instance(
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 753.755669] env[59857]:     created_port_ids = self._update_ports_for_instance(
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 753.755669] env[59857]:     with excutils.save_and_reraise_exception():
[ 753.755669] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 753.755669] env[59857]:     self.force_reraise()
[ 753.755669] env[59857]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 753.755669] env[59857]:     raise self.value
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 753.755669] env[59857]:     updated_port = self._update_port(
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 753.755669] env[59857]:     _ensure_no_port_binding_failure(port)
[ 753.755669] env[59857]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 753.755669] env[59857]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 753.756880] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.
[ 753.756880] env[59857]: Removing descriptor: 17
[ 753.756880] env[59857]: ERROR nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Traceback (most recent call last):
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     yield resources
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     self.driver.spawn(context, instance, image_meta,
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 753.756880] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     vm_ref = self.build_virtual_machine(instance,
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     for vif in network_info:
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     return self._sync_wrapper(fn, *args, **kwargs)
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     self.wait()
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     self[:] = self._gt.wait()
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     return self._exit_event.wait()
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 753.757406] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     result = hub.switch()
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     return self.greenlet.switch()
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     result = function(*args, **kwargs)
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     return func(*args, **kwargs)
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     raise e
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     nwinfo = self.network_api.allocate_for_instance(
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     created_port_ids = self._update_ports_for_instance(
[ 753.758051] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     with excutils.save_and_reraise_exception():
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     self.force_reraise()
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     raise self.value
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     updated_port = self._update_port(
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     _ensure_no_port_binding_failure(port)
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]     raise exception.PortBindingFailed(port_id=port['id'])
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.
[ 753.758404] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a]
[ 753.758735] env[59857]: INFO nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Terminating instance
[ 753.758735] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 753.764469] env[59857]: DEBUG nova.scheduler.client.report [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 753.788082] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 753.788921] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 753.846727] env[59857]: DEBUG nova.compute.utils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 753.846727] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 753.846727] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 753.853044] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 753.863275] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Start building block device mappings for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 753.867027] env[59857]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 753.887091] env[59857]: INFO nova.compute.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Took 0.21 seconds to deallocate network for instance.
[ 753.994927] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Start spawning the instance on the hypervisor. {{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 753.997822] env[59857]: DEBUG nova.policy [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52b81f22ae004f55a58accddbf06b161', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb0823f51ed044b7ad68386fb1f60fb5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}}
[ 754.020803] env[59857]: INFO nova.scheduler.client.report [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Deleted allocations for instance 963513ec-2280-475b-87a0-045df892e8b4
[ 754.034446] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 754.034737] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 754.034924] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 754.035151] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 754.035311] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 754.035791] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 754.035791] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 754.035924] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 754.036119] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 754.036922] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 754.036922] env[59857]: DEBUG nova.virt.hardware [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 754.039776] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ddcb9d-f49a-44ef-bda7-697fd4bce2bb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.053529] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3c9cbde-4c7b-45d0-ba97-e70fbed7cb9f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.058045] env[59857]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "963513ec-2280-475b-87a0-045df892e8b4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.958s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 754.588730] env[59857]: DEBUG nova.network.neutron [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 754.601836] env[59857]: DEBUG oslo_concurrency.lockutils [req-98b485ad-0d21-4328-b8ec-68433deee095 req-e922b463-2547-417a-bec2-38dddaaa7bc8 service nova] Releasing lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 754.601836] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquired lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 754.601836] env[59857]: DEBUG nova.network.neutron [None
req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.611541] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Successfully created port: 9d208dba-9a95-4330-8921-302664ac21ba {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 754.702973] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.107020] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Successfully created port: 55fd5071-a52d-47ff-875a-8662e0df32fd {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 755.467054] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.475979] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Releasing lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.476425] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 755.476654] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 755.477192] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4f3d111d-65ab-4d73-9bc9-3c82c6ea2262 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.488748] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25f8b6dd-efcd-4781-9ba0-76f903b6f25c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.523169] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f12524a7-21b9-4e35-b15b-955627d58c7a could not be found. [ 755.523439] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 755.523633] env[59857]: INFO nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 755.523883] env[59857]: DEBUG oslo.service.loopingcall [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 755.524469] env[59857]: DEBUG nova.compute.manager [-] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.524592] env[59857]: DEBUG nova.network.neutron [-] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.585707] env[59857]: DEBUG nova.network.neutron [-] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.594783] env[59857]: DEBUG nova.network.neutron [-] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.607194] env[59857]: INFO nova.compute.manager [-] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Took 0.08 seconds to deallocate network for instance. 
[ 755.607432] env[59857]: DEBUG nova.compute.claims [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 755.607606] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.607900] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.801037] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ab60b97-5b60-495d-915f-029943922896 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.811355] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a5ab43c-c88e-44e9-8071-b0f76db2b2c5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.847367] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4f371dd-9b23-4564-a2d8-451deffc285e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.856913] env[59857]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0cad7d-e0b2-42fa-ab57-58547bf88245 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.875202] env[59857]: DEBUG nova.compute.provider_tree [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 755.888735] env[59857]: DEBUG nova.scheduler.client.report [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 755.920586] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.920586] env[59857]: ERROR nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] 
[instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information. [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Traceback (most recent call last): [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] self.driver.spawn(context, instance, image_meta, [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 755.920586] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] vm_ref = self.build_virtual_machine(instance, [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] vif_infos = vmwarevif.get_vif_info(self._session, [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] for vif 
in network_info: [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] return self._sync_wrapper(fn, *args, **kwargs) [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] self.wait() [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] self[:] = self._gt.wait() [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] return self._exit_event.wait() [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 755.921569] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] result = hub.switch() [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] return self.greenlet.switch() [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] result = function(*args, **kwargs) [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] return func(*args, **kwargs) [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] raise e [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] nwinfo = self.network_api.allocate_for_instance( [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] created_port_ids = self._update_ports_for_instance( [ 755.922363] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] with excutils.save_and_reraise_exception(): [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] self.force_reraise() [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] raise self.value [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] updated_port = self._update_port( [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] _ensure_no_port_binding_failure(port) [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] raise exception.PortBindingFailed(port_id=port['id']) [ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information. 
[ 755.924293] env[59857]: ERROR nova.compute.manager [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] [ 755.924969] env[59857]: DEBUG nova.compute.utils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 755.924969] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Build of instance f12524a7-21b9-4e35-b15b-955627d58c7a was re-scheduled: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 755.925227] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 755.925454] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.926984] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 
tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquired lock "refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.926984] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 756.032214] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.038404] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Successfully created port: d2fae340-0a11-4474-8e26-d713b0ec1239 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 756.511890] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.529022] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Releasing lock 
"refresh_cache-f12524a7-21b9-4e35-b15b-955627d58c7a" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 756.529022] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 756.529022] env[59857]: DEBUG nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 756.529022] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 756.766556] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.773725] env[59857]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.784368] env[59857]: INFO nova.compute.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Took 0.25 seconds to deallocate network for instance. [ 756.890309] env[59857]: INFO nova.scheduler.client.report [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Deleted allocations for instance f12524a7-21b9-4e35-b15b-955627d58c7a [ 756.911491] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "f12524a7-21b9-4e35-b15b-955627d58c7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.544s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 759.212776] env[59857]: DEBUG nova.compute.manager [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Received event network-changed-3c3207c4-43fc-434a-b522-b2b074acdf74 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 759.213028] env[59857]: DEBUG nova.compute.manager [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d 
req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Refreshing instance network info cache due to event network-changed-3c3207c4-43fc-434a-b522-b2b074acdf74. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 759.213180] env[59857]: DEBUG oslo_concurrency.lockutils [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] Acquiring lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.213312] env[59857]: DEBUG oslo_concurrency.lockutils [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] Acquired lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.214587] env[59857]: DEBUG nova.network.neutron [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Refreshing network info cache for port 3c3207c4-43fc-434a-b522-b2b074acdf74 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 759.344095] env[59857]: DEBUG nova.network.neutron [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.644689] env[59857]: ERROR nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. [ 759.644689] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 759.644689] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.644689] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 759.644689] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.644689] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 759.644689] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.644689] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 759.644689] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.644689] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 759.644689] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.644689] env[59857]: ERROR nova.compute.manager raise self.value [ 759.644689] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.644689] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 759.644689] env[59857]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.644689] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 759.645231] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.645231] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 759.645231] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. [ 759.645231] env[59857]: ERROR nova.compute.manager [ 759.645231] env[59857]: Traceback (most recent call last): [ 759.645231] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 759.645231] env[59857]: listener.cb(fileno) [ 759.645231] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.645231] env[59857]: result = function(*args, **kwargs) [ 759.645231] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.645231] env[59857]: return func(*args, **kwargs) [ 759.645231] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.645231] env[59857]: raise e [ 759.645231] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.645231] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 759.645231] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.645231] env[59857]: created_port_ids = self._update_ports_for_instance( [ 759.645231] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.645231] env[59857]: with excutils.save_and_reraise_exception(): [ 759.645231] env[59857]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.645231] env[59857]: self.force_reraise() [ 759.645231] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.645231] env[59857]: raise self.value [ 759.645231] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.645231] env[59857]: updated_port = self._update_port( [ 759.645231] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.645231] env[59857]: _ensure_no_port_binding_failure(port) [ 759.645231] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.645231] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 759.645993] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. [ 759.645993] env[59857]: Removing descriptor: 17 [ 759.649633] env[59857]: ERROR nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. 
[ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Traceback (most recent call last): [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] yield resources [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self.driver.spawn(context, instance, image_meta, [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] vm_ref = self.build_virtual_machine(instance, [ 759.649633] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] vif_infos = vmwarevif.get_vif_info(self._session, [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 759.649986] env[59857]: ERROR 
nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] for vif in network_info: [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return self._sync_wrapper(fn, *args, **kwargs) [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self.wait() [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self[:] = self._gt.wait() [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return self._exit_event.wait() [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] result = hub.switch() [ 759.649986] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return self.greenlet.switch() [ 759.650429] env[59857]: ERROR 
nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] result = function(*args, **kwargs) [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return func(*args, **kwargs) [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] raise e [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] nwinfo = self.network_api.allocate_for_instance( [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] created_port_ids = self._update_ports_for_instance( [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.650429] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] with excutils.save_and_reraise_exception(): [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 
67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self.force_reraise() [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] raise self.value [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] updated_port = self._update_port( [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] _ensure_no_port_binding_failure(port) [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] raise exception.PortBindingFailed(port_id=port['id']) [ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. 
[ 759.650793] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] [ 759.651404] env[59857]: INFO nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Terminating instance [ 759.653621] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "refresh_cache-67b672b6-c6cb-4dc2-9d75-fb028195a0dd" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.653791] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquired lock "refresh_cache-67b672b6-c6cb-4dc2-9d75-fb028195a0dd" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.653958] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 759.737721] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.791209] env[59857]: ERROR nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. [ 759.791209] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 759.791209] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.791209] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 759.791209] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.791209] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 759.791209] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.791209] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 759.791209] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.791209] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 759.791209] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.791209] env[59857]: ERROR nova.compute.manager raise self.value [ 759.791209] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.791209] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 759.791209] env[59857]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.791209] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 759.791700] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.791700] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 759.791700] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. [ 759.791700] env[59857]: ERROR nova.compute.manager [ 759.791700] env[59857]: Traceback (most recent call last): [ 759.791700] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 759.791700] env[59857]: listener.cb(fileno) [ 759.791700] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.791700] env[59857]: result = function(*args, **kwargs) [ 759.791700] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.791700] env[59857]: return func(*args, **kwargs) [ 759.791700] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.791700] env[59857]: raise e [ 759.791700] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.791700] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 759.791700] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.791700] env[59857]: created_port_ids = self._update_ports_for_instance( [ 759.791700] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.791700] env[59857]: with excutils.save_and_reraise_exception(): [ 759.791700] env[59857]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.791700] env[59857]: self.force_reraise() [ 759.791700] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.791700] env[59857]: raise self.value [ 759.791700] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.791700] env[59857]: updated_port = self._update_port( [ 759.791700] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.791700] env[59857]: _ensure_no_port_binding_failure(port) [ 759.791700] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.791700] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 759.792415] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. [ 759.792415] env[59857]: Removing descriptor: 15 [ 759.792415] env[59857]: ERROR nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. 
[ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Traceback (most recent call last): [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] yield resources [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self.driver.spawn(context, instance, image_meta, [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 759.792415] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] vm_ref = self.build_virtual_machine(instance, [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] vif_infos = vmwarevif.get_vif_info(self._session, [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 759.792723] env[59857]: ERROR 
nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] for vif in network_info: [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return self._sync_wrapper(fn, *args, **kwargs) [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self.wait() [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self[:] = self._gt.wait() [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return self._exit_event.wait() [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 759.792723] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] result = hub.switch() [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return self.greenlet.switch() [ 759.793073] env[59857]: ERROR 
nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] result = function(*args, **kwargs) [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return func(*args, **kwargs) [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] raise e [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] nwinfo = self.network_api.allocate_for_instance( [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] created_port_ids = self._update_ports_for_instance( [ 759.793073] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] with excutils.save_and_reraise_exception(): [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 
507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self.force_reraise() [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] raise self.value [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] updated_port = self._update_port( [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] _ensure_no_port_binding_failure(port) [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] raise exception.PortBindingFailed(port_id=port['id']) [ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. 
[ 759.793401] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] [ 759.796961] env[59857]: INFO nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Terminating instance [ 759.796961] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.827746] env[59857]: DEBUG nova.network.neutron [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.843808] env[59857]: DEBUG oslo_concurrency.lockutils [req-5ee9c798-3a67-46b7-a48e-d4ee02764c2d req-f14db0cd-deeb-401e-80da-8de94a590f5e service nova] Releasing lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 759.843808] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.843808] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 
507d89fc-2083-4575-9a9c-f7f350741ef3] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 759.922405] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.250752] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.267443] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Releasing lock "refresh_cache-67b672b6-c6cb-4dc2-9d75-fb028195a0dd" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.267443] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 760.267443] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 760.267443] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4781ea0a-ce59-4e73-a435-043766c9c0c5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.279083] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93cd88a6-72ad-47c1-9bb5-353f63c992d0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.311648] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd could not be found. [ 760.311924] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 760.312077] env[59857]: INFO nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 760.312325] env[59857]: DEBUG oslo.service.loopingcall [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 760.313078] env[59857]: DEBUG nova.compute.manager [-] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 760.313225] env[59857]: DEBUG nova.network.neutron [-] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 760.344900] env[59857]: DEBUG nova.network.neutron [-] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.354309] env[59857]: DEBUG nova.network.neutron [-] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.365932] env[59857]: INFO nova.compute.manager [-] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Took 0.05 seconds to deallocate network for instance. 
[ 760.368381] env[59857]: DEBUG nova.compute.claims [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 760.368559] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.368765] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.468713] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.491326] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.491326] env[59857]: DEBUG nova.compute.manager [None 
req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 760.491326] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 760.491326] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-87bd7f57-07bb-4991-8e6a-e4a4d0f0ad11 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.508155] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0448d5d0-73fd-42aa-b53e-4d45a1f64f5d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.536686] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 507d89fc-2083-4575-9a9c-f7f350741ef3 could not be found. 
[ 760.536951] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 760.537223] env[59857]: INFO nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Took 0.05 seconds to destroy the instance on the hypervisor. [ 760.537480] env[59857]: DEBUG oslo.service.loopingcall [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 760.537768] env[59857]: DEBUG nova.compute.manager [-] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 760.537815] env[59857]: DEBUG nova.network.neutron [-] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 760.573825] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8208e05a-0d55-4105-9f2d-4ab6d18aa966 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.584701] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2727e4c-a7f6-45de-a519-16d69511051e {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.621673] env[59857]: DEBUG nova.network.neutron [-] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.627659] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ae5b879-ae6e-4fcd-aba6-3514fb2d4683 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.640649] env[59857]: DEBUG nova.network.neutron [-] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.649024] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5286747d-2068-439d-8508-7a62433aacd4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.653901] env[59857]: INFO nova.compute.manager [-] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Took 0.12 seconds to deallocate network for instance. 
[ 760.656098] env[59857]: DEBUG nova.compute.claims [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 760.656276] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.672018] env[59857]: DEBUG nova.compute.provider_tree [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 760.680254] env[59857]: DEBUG nova.scheduler.client.report [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 760.699637] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f 
tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.331s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.700279] env[59857]: ERROR nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Traceback (most recent call last): [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self.driver.spawn(context, instance, image_meta, [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] vm_ref = self.build_virtual_machine(instance, [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] vif_infos = vmwarevif.get_vif_info(self._session, [ 760.700279] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] for vif in network_info: [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return self._sync_wrapper(fn, *args, **kwargs) [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self.wait() [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self[:] = self._gt.wait() [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return self._exit_event.wait() [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 760.700623] env[59857]: ERROR 
nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] result = hub.switch() [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return self.greenlet.switch() [ 760.700623] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] result = function(*args, **kwargs) [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] return func(*args, **kwargs) [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] raise e [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] nwinfo = self.network_api.allocate_for_instance( [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] created_port_ids = 
self._update_ports_for_instance( [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] with excutils.save_and_reraise_exception(): [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 760.700994] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] self.force_reraise() [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] raise self.value [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] updated_port = self._update_port( [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] _ensure_no_port_binding_failure(port) [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] raise 
exception.PortBindingFailed(port_id=port['id']) [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. [ 760.701342] env[59857]: ERROR nova.compute.manager [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] [ 760.701342] env[59857]: DEBUG nova.compute.utils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 760.703908] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.046s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.705930] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Build of instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd was re-scheduled: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 760.706399] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 760.706633] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "refresh_cache-67b672b6-c6cb-4dc2-9d75-fb028195a0dd" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 760.706817] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquired lock "refresh_cache-67b672b6-c6cb-4dc2-9d75-fb028195a0dd" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 760.706976] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 760.809795] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.872955] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45b252a1-3c8d-47fa-8ed0-740036706c14 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.886804] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db98f676-6e85-429e-9255-8bd4d4d1f718 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.924065] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61c382da-bc0d-43bb-8e94-20669ea33214 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.935942] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-931f4ef1-61c1-42ff-bc7f-b573d38d10e0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.956121] env[59857]: DEBUG nova.compute.provider_tree [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 760.967799] env[59857]: DEBUG nova.scheduler.client.report [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 760.986952] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.284s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.986952] env[59857]: ERROR nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. 
[ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Traceback (most recent call last): [ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self.driver.spawn(context, instance, image_meta, [ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 760.986952] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] vm_ref = self.build_virtual_machine(instance, [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] vif_infos = vmwarevif.get_vif_info(self._session, [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] for vif in network_info: [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 760.987212] env[59857]: ERROR 
nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return self._sync_wrapper(fn, *args, **kwargs) [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self.wait() [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self[:] = self._gt.wait() [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return self._exit_event.wait() [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 760.987212] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] result = hub.switch() [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return self.greenlet.switch() [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] result = function(*args, **kwargs) [ 
760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] return func(*args, **kwargs) [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] raise e [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] nwinfo = self.network_api.allocate_for_instance( [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] created_port_ids = self._update_ports_for_instance( [ 760.987556] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] with excutils.save_and_reraise_exception(): [ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] self.force_reraise() [ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 
507d89fc-2083-4575-9a9c-f7f350741ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] raise self.value
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] updated_port = self._update_port(
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] _ensure_no_port_binding_failure(port)
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] raise exception.PortBindingFailed(port_id=port['id'])
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information.
[ 760.987894] env[59857]: ERROR nova.compute.manager [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3]
[ 760.988277] env[59857]: DEBUG nova.compute.utils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 760.989190] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Build of instance 507d89fc-2083-4575-9a9c-f7f350741ef3 was re-scheduled: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 760.990054] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 760.990277] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 760.990411] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquired lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 760.990561] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 761.060078] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.095016] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.113077] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Releasing lock "refresh_cache-67b672b6-c6cb-4dc2-9d75-fb028195a0dd" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 761.113323] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 761.113502] env[59857]: DEBUG nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 761.113667] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 761.167758] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.174120] env[59857]: ERROR nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.
[ 761.174120] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 761.174120] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 761.174120] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 761.174120] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 761.174120] env[59857]: ERROR nova.compute.manager self.force_reraise()
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 761.174120] env[59857]: ERROR nova.compute.manager raise self.value
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 761.174120] env[59857]: ERROR nova.compute.manager updated_port = self._update_port(
[ 761.174120] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 761.174120] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 761.174614] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 761.174614] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 761.174614] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.
[ 761.174614] env[59857]: ERROR nova.compute.manager
[ 761.174614] env[59857]: Traceback (most recent call last):
[ 761.174614] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 761.174614] env[59857]: listener.cb(fileno)
[ 761.174614] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 761.174614] env[59857]: result = function(*args, **kwargs)
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 761.174614] env[59857]: return func(*args, **kwargs)
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 761.174614] env[59857]: raise e
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 761.174614] env[59857]: nwinfo = self.network_api.allocate_for_instance(
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 761.174614] env[59857]: created_port_ids = self._update_ports_for_instance(
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 761.174614] env[59857]: with excutils.save_and_reraise_exception():
[ 761.174614] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 761.174614] env[59857]: self.force_reraise()
[ 761.174614] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 761.174614] env[59857]: raise self.value
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 761.174614] env[59857]: updated_port = self._update_port(
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 761.174614] env[59857]: _ensure_no_port_binding_failure(port)
[ 761.174614] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 761.174614] env[59857]: raise exception.PortBindingFailed(port_id=port['id'])
[ 761.175352] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.
[ 761.175352] env[59857]: Removing descriptor: 19
[ 761.175352] env[59857]: ERROR nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Traceback (most recent call last):
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] yield resources
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self.driver.spawn(context, instance, image_meta,
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 761.175352] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] vm_ref = self.build_virtual_machine(instance,
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] vif_infos = vmwarevif.get_vif_info(self._session,
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] for vif in network_info:
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return self._sync_wrapper(fn, *args, **kwargs)
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self.wait()
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self[:] = self._gt.wait()
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return self._exit_event.wait()
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 761.175663] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] result = hub.switch()
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return self.greenlet.switch()
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] result = function(*args, **kwargs)
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return func(*args, **kwargs)
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] raise e
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] nwinfo = self.network_api.allocate_for_instance(
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] created_port_ids = self._update_ports_for_instance(
[ 761.176030] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] with excutils.save_and_reraise_exception():
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self.force_reraise()
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] raise self.value
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] updated_port = self._update_port(
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] _ensure_no_port_binding_failure(port)
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] raise exception.PortBindingFailed(port_id=port['id'])
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.
[ 761.176345] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7]
[ 761.176736] env[59857]: INFO nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Terminating instance
[ 761.178535] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "refresh_cache-f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 761.178763] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquired lock "refresh_cache-f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 761.178984] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 761.180479] env[59857]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.194268] env[59857]: INFO nova.compute.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Took 0.08 seconds to deallocate network for instance.
[ 761.217731] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.307839] env[59857]: INFO nova.scheduler.client.report [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Deleted allocations for instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd
[ 761.336145] env[59857]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "67b672b6-c6cb-4dc2-9d75-fb028195a0dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.989s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 761.582287] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.600150] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Releasing lock "refresh_cache-f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 761.600597] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 761.600799] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 761.601809] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1ace8769-ac72-4945-a582-9c143cc95032 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 761.614581] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02b9c7d0-0773-440c-b01a-00ed9d624443 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 761.650282] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7 could not be found.
[ 761.650282] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 761.650282] env[59857]: INFO nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 761.650472] env[59857]: DEBUG oslo.service.loopingcall [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 761.650689] env[59857]: DEBUG nova.compute.manager [-] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 761.650785] env[59857]: DEBUG nova.network.neutron [-] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 761.679236] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.693505] env[59857]: DEBUG nova.network.neutron [-] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.697954] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Releasing lock "refresh_cache-507d89fc-2083-4575-9a9c-f7f350741ef3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 761.698074] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 761.698299] env[59857]: DEBUG nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 761.698466] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 761.713276] env[59857]: DEBUG nova.network.neutron [-] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.723310] env[59857]: INFO nova.compute.manager [-] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Took 0.07 seconds to deallocate network for instance.
[ 761.725539] env[59857]: DEBUG nova.compute.claims [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 761.726457] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 761.726457] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 761.750555] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.765950] env[59857]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.780124] env[59857]: INFO nova.compute.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Took 0.08 seconds to deallocate network for instance.
[ 761.806170] env[59857]: ERROR nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.
[ 761.806170] env[59857]: ERROR nova.compute.manager Traceback (most recent call last):
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 761.806170] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 761.806170] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 761.806170] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 761.806170] env[59857]: ERROR nova.compute.manager self.force_reraise()
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 761.806170] env[59857]: ERROR nova.compute.manager raise self.value
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 761.806170] env[59857]: ERROR nova.compute.manager updated_port = self._update_port(
[ 761.806170] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 761.806170] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 761.806745] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 761.806745] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 761.806745] env[59857]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.
[ 761.806745] env[59857]: ERROR nova.compute.manager
[ 761.806745] env[59857]: Traceback (most recent call last):
[ 761.806745] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 761.806745] env[59857]: listener.cb(fileno)
[ 761.806745] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 761.806745] env[59857]: result = function(*args, **kwargs)
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 761.806745] env[59857]: return func(*args, **kwargs)
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 761.806745] env[59857]: raise e
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 761.806745] env[59857]: nwinfo = self.network_api.allocate_for_instance(
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 761.806745] env[59857]: created_port_ids = self._update_ports_for_instance(
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 761.806745] env[59857]: with excutils.save_and_reraise_exception():
[ 761.806745] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 761.806745] env[59857]: self.force_reraise()
[ 761.806745] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 761.806745] env[59857]: raise self.value
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 761.806745] env[59857]: updated_port = self._update_port(
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 761.806745] env[59857]: _ensure_no_port_binding_failure(port)
[ 761.806745] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 761.806745] env[59857]: raise exception.PortBindingFailed(port_id=port['id'])
[ 761.809393] env[59857]: nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.
[ 761.809393] env[59857]: Removing descriptor: 12
[ 761.809393] env[59857]: ERROR nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Traceback (most recent call last):
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] yield resources
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] self.driver.spawn(context, instance, image_meta,
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 761.809393] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] vm_ref = self.build_virtual_machine(instance,
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] vif_infos = vmwarevif.get_vif_info(self._session,
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] for vif in network_info:
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] return self._sync_wrapper(fn, *args, **kwargs)
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] self.wait()
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] self[:] = self._gt.wait()
[ 761.809886]
env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] return self._exit_event.wait() [ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 761.809886] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] result = hub.switch() [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] return self.greenlet.switch() [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] result = function(*args, **kwargs) [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] return func(*args, **kwargs) [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] raise e [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] nwinfo = self.network_api.allocate_for_instance( [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] created_port_ids = self._update_ports_for_instance( [ 761.813265] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] with excutils.save_and_reraise_exception(): [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] self.force_reraise() [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] raise self.value [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] updated_port = self._update_port( [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] _ensure_no_port_binding_failure(port) [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] raise exception.PortBindingFailed(port_id=port['id']) [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information. [ 761.813638] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] [ 761.813960] env[59857]: INFO nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Terminating instance [ 761.813960] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 761.813960] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquired lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 761.813960] env[59857]: DEBUG nova.network.neutron [None 
req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 761.853213] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 761.922957] env[59857]: INFO nova.scheduler.client.report [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Deleted allocations for instance 507d89fc-2083-4575-9a9c-f7f350741ef3 [ 761.955651] env[59857]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "507d89fc-2083-4575-9a9c-f7f350741ef3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.221s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.960370] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edefdfb0-8c8b-46a3-ad39-fd7b6972f86c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.970834] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebbe2d07-f78d-467d-98ca-3bab3a252028 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.026322] env[59857]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada7f8eb-d3af-41a4-ab22-560602f4f42a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.035652] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1f80201-9dbe-4840-9354-edf50b9b42d4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.052095] env[59857]: DEBUG nova.compute.provider_tree [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 762.062768] env[59857]: DEBUG nova.scheduler.client.report [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 762.077496] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.351s {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.078154] env[59857]: ERROR nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information. [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Traceback (most recent call last): [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self.driver.spawn(context, instance, image_meta, [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] vm_ref = self.build_virtual_machine(instance, [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] vif_infos = vmwarevif.get_vif_info(self._session, [ 
762.078154] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] for vif in network_info: [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return self._sync_wrapper(fn, *args, **kwargs) [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self.wait() [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self[:] = self._gt.wait() [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return self._exit_event.wait() [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] result = hub.switch() [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return self.greenlet.switch() [ 762.078513] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] result = function(*args, **kwargs) [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] return func(*args, **kwargs) [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] raise e [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] nwinfo = self.network_api.allocate_for_instance( [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] created_port_ids = self._update_ports_for_instance( [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in 
_update_ports_for_instance [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] with excutils.save_and_reraise_exception(): [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 762.078912] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] self.force_reraise() [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] raise self.value [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] updated_port = self._update_port( [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] _ensure_no_port_binding_failure(port) [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] raise exception.PortBindingFailed(port_id=port['id']) [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] nova.exception.PortBindingFailed: Binding failed for port 
9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information. [ 762.079316] env[59857]: ERROR nova.compute.manager [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] [ 762.079581] env[59857]: DEBUG nova.compute.utils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 762.081806] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Build of instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7 was re-scheduled: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 762.081806] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 762.081806] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "refresh_cache-f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 762.081806] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquired lock "refresh_cache-f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 762.085205] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 762.093917] env[59857]: DEBUG nova.compute.manager [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Received event network-changed-d2fae340-0a11-4474-8e26-d713b0ec1239 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 762.094099] env[59857]: DEBUG 
nova.compute.manager [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Refreshing instance network info cache due to event network-changed-d2fae340-0a11-4474-8e26-d713b0ec1239. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 762.094278] env[59857]: DEBUG oslo_concurrency.lockutils [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] Acquiring lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 762.135382] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.138323] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.147030] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Releasing lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 762.147138] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 
tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 762.147261] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 762.147563] env[59857]: DEBUG oslo_concurrency.lockutils [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] Acquired lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 762.147728] env[59857]: DEBUG nova.network.neutron [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Refreshing network info cache for port d2fae340-0a11-4474-8e26-d713b0ec1239 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 762.149188] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ac3a5ecf-7297-43f4-8e6f-fafa33ee3d9a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.162325] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bfa8865-4108-4ac5-b152-4ccb15d58413 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.192480] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 
3835af93-7a47-4d3c-9296-256aadddc3b3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3835af93-7a47-4d3c-9296-256aadddc3b3 could not be found. [ 762.192805] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 762.193068] env[59857]: INFO nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Took 0.05 seconds to destroy the instance on the hypervisor. [ 762.193302] env[59857]: DEBUG oslo.service.loopingcall [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 762.194440] env[59857]: DEBUG nova.network.neutron [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.196353] env[59857]: DEBUG nova.compute.manager [-] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 762.196353] env[59857]: DEBUG nova.network.neutron [-] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 762.253232] env[59857]: DEBUG nova.network.neutron [-] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.265304] env[59857]: DEBUG nova.network.neutron [-] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.276902] env[59857]: INFO nova.compute.manager [-] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Took 0.08 seconds to deallocate network for instance. 
[ 762.279333] env[59857]: DEBUG nova.compute.claims [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 762.279552] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 762.280009] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 762.290647] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.305984] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Releasing lock "refresh_cache-f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 762.305984] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 762.305984] env[59857]: DEBUG nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 762.306105] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 762.347952] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 762.351465] env[59857]: DEBUG nova.network.neutron [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.362315] env[59857]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.367750] env[59857]: DEBUG oslo_concurrency.lockutils [req-96a14665-e7c6-4017-904d-b603a0076e4f req-7ced79a1-2970-4740-a998-26cda0d974d2 service nova] Releasing lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 762.372381] env[59857]: INFO nova.compute.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Took 0.07 seconds to deallocate network for instance.
[ 762.437406] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88782bf7-8914-4b71-8cf8-6053810293d5 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 762.448960] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfc13c09-abef-4ba6-b2ba-d849655383ec {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 762.483849] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be065a4c-4f2f-4516-9b02-20b1d2640ace {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 762.495267] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-267169ac-1789-4f92-b13f-5d35d80e6cec {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 762.512504] env[59857]: DEBUG nova.compute.provider_tree [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 762.515188] env[59857]: INFO nova.scheduler.client.report [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Deleted allocations for instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7
[ 762.531733] env[59857]: DEBUG nova.scheduler.client.report [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 762.541630] env[59857]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "f3b9789c-ecc6-4b1a-96ec-2c71dba363f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.103s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 762.572017] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.292s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 762.573238] env[59857]: ERROR nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Traceback (most recent call last):
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     self.driver.spawn(context, instance, image_meta,
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     vm_ref = self.build_virtual_machine(instance,
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 762.573238] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     for vif in network_info:
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     return self._sync_wrapper(fn, *args, **kwargs)
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     self.wait()
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     self[:] = self._gt.wait()
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     return self._exit_event.wait()
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     result = hub.switch()
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     return self.greenlet.switch()
[ 762.573590] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     result = function(*args, **kwargs)
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     return func(*args, **kwargs)
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     raise e
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     nwinfo = self.network_api.allocate_for_instance(
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     created_port_ids = self._update_ports_for_instance(
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     with excutils.save_and_reraise_exception():
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 762.573970] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     self.force_reraise()
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     raise self.value
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     updated_port = self._update_port(
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     _ensure_no_port_binding_failure(port)
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]     raise exception.PortBindingFailed(port_id=port['id'])
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.
[ 762.574346] env[59857]: ERROR nova.compute.manager [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3]
[ 762.578304] env[59857]: DEBUG nova.compute.utils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 762.579901] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Build of instance 3835af93-7a47-4d3c-9296-256aadddc3b3 was re-scheduled: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 762.579901] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 762.580079] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 762.580239] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquired lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 762.580472] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 762.644202] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 762.787491] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.828682] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Releasing lock "refresh_cache-3835af93-7a47-4d3c-9296-256aadddc3b3" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 762.828909] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 762.829107] env[59857]: DEBUG nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 762.829278] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 762.859406] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 762.869215] env[59857]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.879987] env[59857]: INFO nova.compute.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Took 0.05 seconds to deallocate network for instance.
[ 762.976532] env[59857]: INFO nova.scheduler.client.report [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Deleted allocations for instance 3835af93-7a47-4d3c-9296-256aadddc3b3
[ 763.000425] env[59857]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "3835af93-7a47-4d3c-9296-256aadddc3b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.342s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 763.906446] env[59857]: WARNING oslo_vmware.rw_handles [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles     response.begin()
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 763.906446] env[59857]: ERROR oslo_vmware.rw_handles
[ 763.906446] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Downloaded image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 763.907176] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Caching image {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 763.907176] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Copying Virtual Disk [datastore2] vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk to [datastore2] vmware_temp/8c5861da-de71-4c66-a091-67682bd79c41/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk {{(pid=59857) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 763.907274] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cab14fa5-0289-449d-a568-c0d00df82791 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.919261] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Waiting for the task: (returnval){
[ 763.919261] env[59857]:   value = "task-1341445"
[ 763.919261] env[59857]:   _type = "Task"
[ 763.919261] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 763.928104] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Task: {'id': task-1341445, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 764.431893] env[59857]: DEBUG oslo_vmware.exceptions [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Fault InvalidArgument not matched. {{(pid=59857) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 764.432176] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 764.436510] env[59857]: ERROR nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 764.436510] env[59857]: Faults: ['InvalidArgument']
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Traceback (most recent call last):
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     yield resources
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     self.driver.spawn(context, instance, image_meta,
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     self._fetch_image_if_missing(context, vi)
[ 764.436510] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     image_cache(vi, tmp_image_ds_loc)
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     vm_util.copy_virtual_disk(
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     session._wait_for_task(vmdk_copy_task)
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     return self.wait_for_task(task_ref)
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     return evt.wait()
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     result = hub.switch()
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 764.436979] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     return self.greenlet.switch()
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     self.f(*self.args, **self.kw)
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]     raise exceptions.translate_fault(task_info.error)
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Faults: ['InvalidArgument']
[ 764.437334] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc]
[ 764.437334] env[59857]: INFO nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Terminating instance
[ 764.438473] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 764.438679] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 764.438920] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-65aa0c7d-623b-478c-a55e-d3aaf49965e1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 764.442285] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "refresh_cache-5c575e05-5a7c-49b8-b914-9b4a4e347bfc" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 764.442458] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquired lock "refresh_cache-5c575e05-5a7c-49b8-b914-9b4a4e347bfc" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 764.442625] env[59857]: DEBUG nova.network.neutron [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 764.454231] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 764.454231] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59857) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 764.454231] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e59b86e-799f-45b0-b0d9-03db2f7d128a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 764.466808] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Waiting for the task: (returnval){
[ 764.466808] env[59857]:   value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52f8aeba-bdb7-f7d5-babf-7e7462662015"
[ 764.466808] env[59857]:   _type = "Task"
[ 764.466808] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 764.476663] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]52f8aeba-bdb7-f7d5-babf-7e7462662015, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 764.500569] env[59857]: DEBUG nova.network.neutron [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 764.597014] env[59857]: DEBUG nova.network.neutron [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 764.609649] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Releasing lock "refresh_cache-5c575e05-5a7c-49b8-b914-9b4a4e347bfc" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 764.610085] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Start destroying the instance on the hypervisor.
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 764.610292] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 764.611425] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b26c2ead-0eba-45e1-b6dd-a07afc58c847 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 764.621605] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Unregistering the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 764.622197] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-05f32a05-20c2-469b-bdec-c64beb3c5b97 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 764.657221] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Unregistered the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 764.657221] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Deleting contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 764.657221] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Deleting the datastore file [datastore2] 5c575e05-5a7c-49b8-b914-9b4a4e347bfc {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 764.657221] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e373d7ec-ae65-41f3-9859-a4a26c41d56b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 764.664655] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Waiting for the task: (returnval){ [ 764.664655] env[59857]: value = "task-1341447" [ 764.664655] env[59857]: _type = "Task" [ 764.664655] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 764.679190] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Task: {'id': task-1341447, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 764.986279] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Preparing fetch location {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 764.987115] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Creating directory with path [datastore2] vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 764.987483] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a1a75597-7cbb-4e7d-a79b-9a02d612eef1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.002998] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Created directory with path [datastore2] vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 765.003456] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Fetch image to [datastore2] vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 765.003802] env[59857]: DEBUG 
nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to [datastore2] vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 765.004899] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31950f58-8e3f-4860-8ebb-ac5fa6621316 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.020671] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-689910d6-d7d5-4064-b1bd-673e393308a1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.034536] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be38e945-c98d-45fb-b7c9-4395255f1a8c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.072336] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e5c4b28-05a1-426d-8475-88fad414f0bc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.080042] env[59857]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-83e0f09b-3669-461d-99b6-12df56b57c06 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.177298] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 
tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 765.185515] env[59857]: DEBUG oslo_vmware.api [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Task: {'id': task-1341447, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041143} completed successfully. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 765.186271] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Deleted the datastore file {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 765.186271] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Deleted contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 765.186271] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 765.186954] env[59857]: INFO nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Took 0.58 seconds to destroy the 
instance on the hypervisor. [ 765.186954] env[59857]: DEBUG oslo.service.loopingcall [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 765.186954] env[59857]: DEBUG nova.compute.manager [-] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Skipping network deallocation for instance since networking was not requested. {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 765.191222] env[59857]: DEBUG nova.compute.claims [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 765.192465] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 765.192465] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 765.252593] env[59857]: DEBUG oslo_vmware.rw_handles [None req-5ceb329e-e806-49d1-8534-99e1585182e9 
tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59857) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 765.320150] env[59857]: DEBUG oslo_vmware.rw_handles [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Completed reading data from the image iterator. {{(pid=59857) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 765.320371] env[59857]: DEBUG oslo_vmware.rw_handles [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59857) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 765.396264] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1209e611-556f-42b5-be07-7536d65d13bf {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.415816] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31f5f228-ca27-4866-b3d9-83da9b3ee346 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.460841] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a6c947-77a9-4be6-a520-36a6a6e0c5ed {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.470643] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d51bc6-3321-4dce-ac59-6e895930c23a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.486534] env[59857]: DEBUG nova.compute.provider_tree [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 765.502088] env[59857]: DEBUG nova.scheduler.client.report [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 
1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 765.521985] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.330s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 765.522565] env[59857]: ERROR nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 765.522565] env[59857]: Faults: ['InvalidArgument'] [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Traceback (most recent call last): [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] self.driver.spawn(context, instance, image_meta, [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 765.522565] env[59857]: ERROR nova.compute.manager 
[instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] self._fetch_image_if_missing(context, vi) [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] image_cache(vi, tmp_image_ds_loc) [ 765.522565] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] vm_util.copy_virtual_disk( [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] session._wait_for_task(vmdk_copy_task) [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] return self.wait_for_task(task_ref) [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] return evt.wait() [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] result = hub.switch() [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] return self.greenlet.switch() [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 765.522933] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] self.f(*self.args, **self.kw) [ 765.523286] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 765.523286] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] raise exceptions.translate_fault(task_info.error) [ 765.523286] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 765.523286] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Faults: ['InvalidArgument'] [ 765.523286] env[59857]: ERROR nova.compute.manager [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] [ 765.523421] env[59857]: DEBUG nova.compute.utils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] VimFaultException {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 765.525185] 
env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Build of instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc was re-scheduled: A specified parameter was not correct: fileType [ 765.525185] env[59857]: Faults: ['InvalidArgument'] {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 765.525559] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 765.525776] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "refresh_cache-5c575e05-5a7c-49b8-b914-9b4a4e347bfc" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 765.525913] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquired lock "refresh_cache-5c575e05-5a7c-49b8-b914-9b4a4e347bfc" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 765.526077] env[59857]: DEBUG nova.network.neutron [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 765.560776] env[59857]: DEBUG 
nova.network.neutron [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.834157] env[59857]: DEBUG nova.network.neutron [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.844539] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Releasing lock "refresh_cache-5c575e05-5a7c-49b8-b914-9b4a4e347bfc" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 765.844791] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 765.844976] env[59857]: DEBUG nova.compute.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 765.955583] env[59857]: INFO nova.scheduler.client.report [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Deleted allocations for instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc [ 765.976642] env[59857]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "5c575e05-5a7c-49b8-b914-9b4a4e347bfc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 145.512s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.845246] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 808.845595] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Starting heal instance info cache {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 808.845595] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Rebuilding the list of instances to heal {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 808.857183] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Skipping network cache update for instance because it is Building. 
{{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 808.857367] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 808.857470] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Didn't find any instances for network info cache update. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 808.857922] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.840253] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.840489] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.840652] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager.update_available_resource {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.850325] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 809.850641] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 809.850678] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 809.850817] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59857) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 809.851903] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2c1dca6-730f-45ff-b455-5877f3a0e589 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.860493] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b48aad54-9bda-4d09-8ff1-de188ed869bf {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.874165] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3171b09-ebaf-4fd7-997f-f08c91fcda10 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.880197] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-086e37a8-e522-43b4-9f61-0460b68ed89d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.908054] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181502MB free_disk=154GB free_vcpus=48 pci_devices=None {{(pid=59857) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 809.908186] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 809.908363] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 809.945611] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 11f468ba-a807-4490-9dd5-58eaad007865 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.945755] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Instance 26aa196e-e745-494d-814f-7da3cf18ec14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59857) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.945922] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 809.946069] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 809.981812] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c2a96ce-c1cd-4ec1-8454-6b7629eb3020 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.988727] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15dfd32e-bb8e-40f1-99d9-b6f47cfa266c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 810.018390] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98d162a4-e837-4fde-9cca-462bc8f64899 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 810.024938] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35392274-49b1-4408-8781-bb35f86c56b3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 810.037452] env[59857]: DEBUG nova.compute.provider_tree [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 810.045121] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 810.057272] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59857) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 810.057436] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 811.056739] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 811.057151] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 811.057151] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59857) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 811.835609] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 811.835827] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 811.848387] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 814.726429] env[59857]: WARNING oslo_vmware.rw_handles [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles response.begin()
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 814.726429] env[59857]: ERROR oslo_vmware.rw_handles
[ 814.727255] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Downloaded image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 814.728911] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Caching image {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 814.729194] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Copying Virtual Disk [datastore2] vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk to [datastore2] vmware_temp/873ddbed-1ba6-45f2-9e0f-8161a9befc0a/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk {{(pid=59857) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 814.729498] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fc73dc28-aa90-4aa0-9dba-c893f21c73bc {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 814.737869] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Waiting for the task: (returnval){
[ 814.737869] env[59857]: value = "task-1341460"
[ 814.737869] env[59857]: _type = "Task"
[ 814.737869] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 814.745823] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Task: {'id': task-1341460, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 815.249093] env[59857]: DEBUG oslo_vmware.exceptions [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Fault InvalidArgument not matched. {{(pid=59857) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 815.249351] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 815.249877] env[59857]: ERROR nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 815.249877] env[59857]: Faults: ['InvalidArgument']
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Traceback (most recent call last):
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] yield resources
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self.driver.spawn(context, instance, image_meta,
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self._fetch_image_if_missing(context, vi)
[ 815.249877] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] image_cache(vi, tmp_image_ds_loc)
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] vm_util.copy_virtual_disk(
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] session._wait_for_task(vmdk_copy_task)
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] return self.wait_for_task(task_ref)
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] return evt.wait()
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] result = hub.switch()
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 815.250383] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] return self.greenlet.switch()
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self.f(*self.args, **self.kw)
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] raise exceptions.translate_fault(task_info.error)
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Faults: ['InvalidArgument']
[ 815.250765] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865]
[ 815.250765] env[59857]: INFO nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Terminating instance
[ 815.251683] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquired lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 815.251882] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 815.252126] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f644552e-87a0-4331-95cc-099435f0c4ae {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.254120] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "refresh_cache-11f468ba-a807-4490-9dd5-58eaad007865" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 815.254277] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquired lock "refresh_cache-11f468ba-a807-4490-9dd5-58eaad007865" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 815.254439] env[59857]: DEBUG nova.network.neutron [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 815.261052] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 815.261222] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59857) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 815.262343] env[59857]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4617bafc-c69a-4746-95b4-e73e5bd079e1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.270027] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Waiting for the task: (returnval){
[ 815.270027] env[59857]: value = "session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5273effe-b257-9eea-0c6e-33230aceb817"
[ 815.270027] env[59857]: _type = "Task"
[ 815.270027] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 815.278252] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Task: {'id': session[5271f050-9b9a-180d-7d1b-498ca49bf02c]5273effe-b257-9eea-0c6e-33230aceb817, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 815.281209] env[59857]: DEBUG nova.network.neutron [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 815.340660] env[59857]: DEBUG nova.network.neutron [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 815.349480] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Releasing lock "refresh_cache-11f468ba-a807-4490-9dd5-58eaad007865" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 815.349850] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 815.350046] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 815.351065] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c215ac7b-77dc-4fe1-962a-d18f7fb0a98a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.359014] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Unregistering the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 815.359219] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-646f261e-6571-43b0-ba6d-d6f441cfb1c9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.410644] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Unregistered the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 815.410890] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Deleting contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 815.411051] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Deleting the datastore file [datastore2] 11f468ba-a807-4490-9dd5-58eaad007865 {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 815.411319] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-06a892b4-5ada-4a4a-a0a5-0f60edc1776c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.417647] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Waiting for the task: (returnval){
[ 815.417647] env[59857]: value = "task-1341462"
[ 815.417647] env[59857]: _type = "Task"
[ 815.417647] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 815.426287] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Task: {'id': task-1341462, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 815.779879] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Preparing fetch location {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 815.780301] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Creating directory with path [datastore2] vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 815.780377] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-35c3a13b-9fdf-48f1-b102-1095ad295cc7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.791375] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Created directory with path [datastore2] vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5 {{(pid=59857) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 815.791590] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Fetch image to [datastore2] vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 815.791728] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to [datastore2] vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 815.792521] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97d90c2b-add6-4e5a-b6ca-908df8358b19 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.799785] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24ad599c-81b5-44a5-a077-dab23be41097 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.809013] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5010540-ae02-4c3e-843c-9835e3dec761 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.839065] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ebed364-5f61-4e5e-b046-fff3178e2a73 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.844492] env[59857]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-43719962-9a29-4c8a-81be-a318cc685586 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 815.927450] env[59857]: DEBUG oslo_vmware.api [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Task: {'id': task-1341462, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.030127} completed successfully. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 815.927783] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Deleted the datastore file {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 815.928034] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Deleted contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 815.928274] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 815.928524] env[59857]: INFO nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Took 0.58 seconds to destroy the instance on the hypervisor.
[ 815.928816] env[59857]: DEBUG oslo.service.loopingcall [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 815.930411] env[59857]: DEBUG nova.compute.manager [-] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Skipping network deallocation for instance since networking was not requested. {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 815.930820] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Downloading image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 815.934549] env[59857]: DEBUG nova.compute.claims [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 815.934737] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 815.934941] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 815.977582] env[59857]: DEBUG oslo_vmware.rw_handles [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59857) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 816.037391] env[59857]: DEBUG oslo_vmware.rw_handles [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Completed reading data from the image iterator. {{(pid=59857) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 816.037577] env[59857]: DEBUG oslo_vmware.rw_handles [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59857) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 816.054533] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aaa57ba-d8c2-411b-8e25-0abca45fb7eb {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 816.062188] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0ee6e8a-7366-46e1-b059-8e6f6793c6c6 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 816.092652] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dceed5dc-e4b7-442b-852a-e8a08dc4ceea {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 816.099231] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74cc1be5-8293-4bee-bc4f-91de0c933564 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 816.111783] env[59857]: DEBUG nova.compute.provider_tree [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 816.120157] env[59857]: DEBUG nova.scheduler.client.report [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 816.131515] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.196s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.131999] env[59857]: ERROR nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 816.131999] env[59857]: Faults: ['InvalidArgument'] [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Traceback (most recent call last): [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self.driver.spawn(context, instance, image_meta, [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self._vmops.spawn(context, instance, image_meta, injected_files, [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 
11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self._fetch_image_if_missing(context, vi) [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] image_cache(vi, tmp_image_ds_loc) [ 816.131999] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] vm_util.copy_virtual_disk( [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] session._wait_for_task(vmdk_copy_task) [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] return self.wait_for_task(task_ref) [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] return evt.wait() [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] result = hub.switch() [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] return self.greenlet.switch() [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 816.132352] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] self.f(*self.args, **self.kw) [ 816.132680] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 816.132680] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] raise exceptions.translate_fault(task_info.error) [ 816.132680] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 816.132680] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Faults: ['InvalidArgument'] [ 816.132680] env[59857]: ERROR nova.compute.manager [instance: 11f468ba-a807-4490-9dd5-58eaad007865] [ 816.132817] env[59857]: DEBUG nova.compute.utils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] VimFaultException {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 816.134056] env[59857]: 
DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Build of instance 11f468ba-a807-4490-9dd5-58eaad007865 was re-scheduled: A specified parameter was not correct: fileType [ 816.134056] env[59857]: Faults: ['InvalidArgument'] {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 816.134433] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 816.134646] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "refresh_cache-11f468ba-a807-4490-9dd5-58eaad007865" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 816.134786] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquired lock "refresh_cache-11f468ba-a807-4490-9dd5-58eaad007865" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 816.135101] env[59857]: DEBUG nova.network.neutron [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 816.156272] env[59857]: DEBUG nova.network.neutron [None 
req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.210084] env[59857]: DEBUG nova.network.neutron [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.218550] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Releasing lock "refresh_cache-11f468ba-a807-4490-9dd5-58eaad007865" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 816.218747] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 816.218917] env[59857]: DEBUG nova.compute.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 816.296083] env[59857]: INFO nova.scheduler.client.report [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Deleted allocations for instance 11f468ba-a807-4490-9dd5-58eaad007865 [ 816.311577] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "11f468ba-a807-4490-9dd5-58eaad007865" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 191.888s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 820.379306] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "e0c34b27-f25b-48b8-8ace-0bbe5380a336" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.379680] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "e0c34b27-f25b-48b8-8ace-0bbe5380a336" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 820.388949] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] 
[instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Starting instance... {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 820.434778] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.435015] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 820.436616] env[59857]: INFO nova.compute.claims [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 820.516698] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2d333e4-2fa7-4b50-893d-dae5c991f89a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.190728] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1203c5f7-79c3-4425-86ea-abc183edea4b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.219549] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-148fca99-ce1c-449a-9e2a-3c3247e0478a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.226925] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c271a2d-d00a-4f91-ba30-c5f725c9ebf9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.240010] env[59857]: DEBUG nova.compute.provider_tree [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 821.248736] env[59857]: DEBUG nova.scheduler.client.report [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 821.261252] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.826s {{(pid=59857) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 821.261717] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 821.295699] env[59857]: DEBUG nova.compute.utils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 821.297074] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 821.297272] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 821.310022] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 821.366220] env[59857]: DEBUG nova.policy [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9186471c7ae84396a91e6e3f929aded5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '599a0d1dc613482c952b7bc2c9024674', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 821.378909] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 821.402369] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 821.402650] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 821.402759] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 821.403043] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 
tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Flavor pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 821.403087] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 821.403207] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 821.403407] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 821.403559] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 821.403720] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 
821.403875] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 821.404052] env[59857]: DEBUG nova.virt.hardware [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 821.404908] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4184509-dd2c-407b-9d51-2564f72e5a67 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.413554] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-754d50a9-e719-46a3-bd03-7d9e50d88b7f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.631758] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Successfully created port: 629678d3-be29-4351-920d-9cd5c5600081 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 822.338252] env[59857]: DEBUG nova.compute.manager [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Received event network-changed-629678d3-be29-4351-920d-9cd5c5600081 {{(pid=59857) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11004}} [ 822.338462] env[59857]: DEBUG nova.compute.manager [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Refreshing instance network info cache due to event network-changed-629678d3-be29-4351-920d-9cd5c5600081. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 822.338665] env[59857]: DEBUG oslo_concurrency.lockutils [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] Acquiring lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 822.338800] env[59857]: DEBUG oslo_concurrency.lockutils [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] Acquired lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 822.338950] env[59857]: DEBUG nova.network.neutron [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Refreshing network info cache for port 629678d3-be29-4351-920d-9cd5c5600081 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 822.519913] env[59857]: DEBUG nova.network.neutron [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 822.674549] env[59857]: DEBUG nova.network.neutron [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 822.683426] env[59857]: DEBUG oslo_concurrency.lockutils [req-841cc090-a125-406f-8dcd-f50fd28e2922 req-73848c51-19a2-47f9-a3ba-1ada7a5f8f98 service nova] Releasing lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 822.743480] env[59857]: ERROR nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. 
[ 822.743480] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 822.743480] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 822.743480] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 822.743480] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 822.743480] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 822.743480] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 822.743480] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 822.743480] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 822.743480] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 822.743480] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 822.743480] env[59857]: ERROR nova.compute.manager raise self.value [ 822.743480] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 822.743480] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 822.743480] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 822.743480] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 822.744173] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 822.744173] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 822.744173] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. [ 822.744173] env[59857]: ERROR nova.compute.manager [ 822.744173] env[59857]: Traceback (most recent call last): [ 822.744173] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 822.744173] env[59857]: listener.cb(fileno) [ 822.744173] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 822.744173] env[59857]: result = function(*args, **kwargs) [ 822.744173] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 822.744173] env[59857]: return func(*args, **kwargs) [ 822.744173] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 822.744173] env[59857]: raise e [ 822.744173] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 822.744173] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 822.744173] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 822.744173] env[59857]: created_port_ids = self._update_ports_for_instance( [ 822.744173] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 822.744173] env[59857]: with excutils.save_and_reraise_exception(): [ 822.744173] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 822.744173] env[59857]: self.force_reraise() [ 822.744173] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 822.744173] env[59857]: raise self.value [ 822.744173] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 822.744173] env[59857]: updated_port = self._update_port( [ 822.744173] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 822.744173] env[59857]: _ensure_no_port_binding_failure(port) [ 822.744173] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 822.744173] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 822.745098] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. [ 822.745098] env[59857]: Removing descriptor: 12 [ 822.745098] env[59857]: ERROR nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. 
[ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Traceback (most recent call last): [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] yield resources [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self.driver.spawn(context, instance, image_meta, [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self._vmops.spawn(context, instance, image_meta, injected_files, [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 822.745098] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] vm_ref = self.build_virtual_machine(instance, [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] vif_infos = vmwarevif.get_vif_info(self._session, [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 822.745500] env[59857]: ERROR 
nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] for vif in network_info: [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return self._sync_wrapper(fn, *args, **kwargs) [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self.wait() [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self[:] = self._gt.wait() [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return self._exit_event.wait() [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 822.745500] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] result = hub.switch() [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return self.greenlet.switch() [ 822.745944] env[59857]: ERROR 
nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] result = function(*args, **kwargs) [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return func(*args, **kwargs) [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] raise e [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] nwinfo = self.network_api.allocate_for_instance( [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] created_port_ids = self._update_ports_for_instance( [ 822.745944] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] with excutils.save_and_reraise_exception(): [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: 
e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self.force_reraise() [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] raise self.value [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] updated_port = self._update_port( [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] _ensure_no_port_binding_failure(port) [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] raise exception.PortBindingFailed(port_id=port['id']) [ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. 
[ 822.746372] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] [ 822.746774] env[59857]: INFO nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Terminating instance [ 822.746774] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 822.746774] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquired lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 822.746774] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 822.768544] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 822.865497] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 822.874892] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Releasing lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 822.875309] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 822.875495] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 822.875999] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ee564a23-c1c5-4751-a992-c40f0b8a539c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.885032] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb62c04b-5b2e-47cc-869b-726146b89a01 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.906272] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e0c34b27-f25b-48b8-8ace-0bbe5380a336 could not be found. 
[ 822.906488] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 822.906676] env[59857]: INFO nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Took 0.03 seconds to destroy the instance on the hypervisor. [ 822.906895] env[59857]: DEBUG oslo.service.loopingcall [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 822.907106] env[59857]: DEBUG nova.compute.manager [-] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 822.907250] env[59857]: DEBUG nova.network.neutron [-] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 822.922654] env[59857]: DEBUG nova.network.neutron [-] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 822.930037] env[59857]: DEBUG nova.network.neutron [-] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 822.937357] env[59857]: INFO nova.compute.manager [-] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Took 0.03 seconds to deallocate network for instance. [ 822.939376] env[59857]: DEBUG nova.compute.claims [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 822.939547] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 822.939749] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 823.005583] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-649d936f-b64c-4b91-93ef-c71ea649a7ee {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.013066] 
env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a05a945-7b81-4b65-a3dc-de8b49712dd2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.042503] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf5298a3-367c-4805-afd6-f96023636071 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.048915] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-506a21f0-c006-4ebd-aa17-4592fdc77386 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.061380] env[59857]: DEBUG nova.compute.provider_tree [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 823.069101] env[59857]: DEBUG nova.scheduler.client.report [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 823.081146] env[59857]: DEBUG 
oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.141s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 823.081717] env[59857]: ERROR nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Traceback (most recent call last): [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self.driver.spawn(context, instance, image_meta, [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self._vmops.spawn(context, instance, image_meta, injected_files, [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] vm_ref = self.build_virtual_machine(instance, [ 
823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] vif_infos = vmwarevif.get_vif_info(self._session, [ 823.081717] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] for vif in network_info: [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return self._sync_wrapper(fn, *args, **kwargs) [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self.wait() [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self[:] = self._gt.wait() [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return self._exit_event.wait() [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] result = hub.switch() [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return self.greenlet.switch() [ 823.082133] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] result = function(*args, **kwargs) [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] return func(*args, **kwargs) [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] raise e [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] nwinfo = self.network_api.allocate_for_instance( [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.082569] 
env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] created_port_ids = self._update_ports_for_instance( [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] with excutils.save_and_reraise_exception(): [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.082569] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] self.force_reraise() [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] raise self.value [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] updated_port = self._update_port( [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] _ensure_no_port_binding_failure(port) [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.082966] env[59857]: ERROR 
nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] raise exception.PortBindingFailed(port_id=port['id']) [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. [ 823.082966] env[59857]: ERROR nova.compute.manager [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] [ 823.083300] env[59857]: DEBUG nova.compute.utils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 823.084059] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Build of instance e0c34b27-f25b-48b8-8ace-0bbe5380a336 was re-scheduled: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 823.084471] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 823.084684] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 823.084823] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquired lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 823.084976] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 823.106827] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 823.196355] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 823.204442] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Releasing lock "refresh_cache-e0c34b27-f25b-48b8-8ace-0bbe5380a336" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 823.204649] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 823.204825] env[59857]: DEBUG nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 823.204982] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 823.218956] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 823.225468] env[59857]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 823.232409] env[59857]: INFO nova.compute.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Took 0.03 seconds to deallocate network for instance. 
[ 823.308422] env[59857]: INFO nova.scheduler.client.report [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Deleted allocations for instance e0c34b27-f25b-48b8-8ace-0bbe5380a336 [ 823.322928] env[59857]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "e0c34b27-f25b-48b8-8ace-0bbe5380a336" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 2.943s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 827.924531] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "4e0763e1-7348-472a-8838-648991382724" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 827.924809] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "4e0763e1-7348-472a-8838-648991382724" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 827.934344] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 827.980495] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 827.980808] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 827.982244] env[59857]: INFO nova.compute.claims [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 828.061800] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9de3553-9c9e-45a3-95ff-faf162a76351 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.069176] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba30b0e-bac7-43c4-81e1-e5946ba8bcb1 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.098467] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c565aa98-4a18-4465-9180-8dba41d8aa57 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.105570] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e1ae0bb-ea90-42a3-88dc-30932838f906 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.118233] env[59857]: DEBUG nova.compute.provider_tree [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 828.127013] env[59857]: DEBUG nova.scheduler.client.report [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 828.139437] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 828.139889] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 
tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 828.169802] env[59857]: DEBUG nova.compute.utils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 828.171061] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 828.171256] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 828.179372] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 828.227325] env[59857]: DEBUG nova.policy [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '233f69ec97334ba9894ae70d7941be35', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b0a5316a9ae491c985d1f21a2ae77e1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 828.240377] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 828.261383] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 828.261615] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 828.261821] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 828.261935] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Flavor pref 
0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 828.262087] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 828.262416] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 828.262416] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 828.262564] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 828.262718] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 828.262868] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 
tempest-InstanceActionsTestJSON-1277579837-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 828.263043] env[59857]: DEBUG nova.virt.hardware [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 828.264010] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b19f5433-54a3-4822-be24-bf72cf0f433c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.275036] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68bac92a-3171-4eb6-93ba-1818996fb6fa {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.493927] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Successfully created port: e71f4986-317b-4816-a3bd-ef4e509893a2 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 829.264179] env[59857]: DEBUG nova.compute.manager [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] [instance: 4e0763e1-7348-472a-8838-648991382724] Received event network-changed-e71f4986-317b-4816-a3bd-ef4e509893a2 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 829.264501] env[59857]: DEBUG nova.compute.manager [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] 
[instance: 4e0763e1-7348-472a-8838-648991382724] Refreshing instance network info cache due to event network-changed-e71f4986-317b-4816-a3bd-ef4e509893a2. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 829.264652] env[59857]: DEBUG oslo_concurrency.lockutils [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] Acquiring lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 829.264786] env[59857]: DEBUG oslo_concurrency.lockutils [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] Acquired lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 829.264966] env[59857]: DEBUG nova.network.neutron [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] [instance: 4e0763e1-7348-472a-8838-648991382724] Refreshing network info cache for port e71f4986-317b-4816-a3bd-ef4e509893a2 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 829.310019] env[59857]: DEBUG nova.network.neutron [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 829.453161] env[59857]: DEBUG nova.network.neutron [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] [instance: 4e0763e1-7348-472a-8838-648991382724] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 829.464222] env[59857]: DEBUG oslo_concurrency.lockutils [req-3cf40af3-e8d7-4c7e-9f83-b428e1f2de8b req-7d59fdbb-91ab-4a94-9c3a-13813ff2fc7a service nova] Releasing lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 829.524361] env[59857]: ERROR nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information. 
[ 829.524361] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 829.524361] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 829.524361] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 829.524361] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 829.524361] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 829.524361] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 829.524361] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 829.524361] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 829.524361] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 829.524361] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 829.524361] env[59857]: ERROR nova.compute.manager raise self.value [ 829.524361] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 829.524361] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 829.524361] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 829.524361] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 829.524937] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 829.524937] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 829.524937] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information. [ 829.524937] env[59857]: ERROR nova.compute.manager [ 829.524937] env[59857]: Traceback (most recent call last): [ 829.524937] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 829.524937] env[59857]: listener.cb(fileno) [ 829.524937] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 829.524937] env[59857]: result = function(*args, **kwargs) [ 829.524937] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 829.524937] env[59857]: return func(*args, **kwargs) [ 829.524937] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 829.524937] env[59857]: raise e [ 829.524937] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 829.524937] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 829.524937] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 829.524937] env[59857]: created_port_ids = self._update_ports_for_instance( [ 829.524937] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 829.524937] env[59857]: with excutils.save_and_reraise_exception(): [ 829.524937] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 829.524937] env[59857]: self.force_reraise() [ 829.524937] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 829.524937] env[59857]: raise self.value [ 829.524937] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 829.524937] env[59857]: updated_port = self._update_port( [ 829.524937] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 829.524937] env[59857]: _ensure_no_port_binding_failure(port) [ 829.524937] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 829.524937] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 829.525950] env[59857]: nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information. [ 829.525950] env[59857]: Removing descriptor: 12 [ 829.525950] env[59857]: ERROR nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information. [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] Traceback (most recent call last): [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] yield resources [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self.driver.spawn(context, instance, image_meta, [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 829.525950] env[59857]: ERROR nova.compute.manager 
[instance: 4e0763e1-7348-472a-8838-648991382724] self._vmops.spawn(context, instance, image_meta, injected_files, [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 829.525950] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] vm_ref = self.build_virtual_machine(instance, [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] vif_infos = vmwarevif.get_vif_info(self._session, [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] for vif in network_info: [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return self._sync_wrapper(fn, *args, **kwargs) [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self.wait() [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self[:] = self._gt.wait() [ 829.526390] 
env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return self._exit_event.wait()
[ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 829.526390] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] result = hub.switch()
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return self.greenlet.switch()
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] result = function(*args, **kwargs)
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return func(*args, **kwargs)
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] raise e
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] nwinfo = self.network_api.allocate_for_instance(
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] created_port_ids = self._update_ports_for_instance(
[ 829.526866] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] with excutils.save_and_reraise_exception():
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self.force_reraise()
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] raise self.value
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] updated_port = self._update_port(
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] _ensure_no_port_binding_failure(port)
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] raise exception.PortBindingFailed(port_id=port['id'])
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information.
[ 829.527336] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724]
[ 829.527775] env[59857]: INFO nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Terminating instance
[ 829.528556] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 829.528710] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquired lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 829.528867] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 829.551235] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 829.646290] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 829.655381] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Releasing lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 829.655780] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Start destroying the instance on the hypervisor. {{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 829.655962] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 829.656754] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-50673891-a2a5-4bb1-86ad-b59ee301c832 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.665994] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df7e8c8d-b573-4773-a811-6cf40e350858 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.687316] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4e0763e1-7348-472a-8838-648991382724 could not be found.
[ 829.687576] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 829.687706] env[59857]: INFO nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Took 0.03 seconds to destroy the instance on the hypervisor.
[ 829.687927] env[59857]: DEBUG oslo.service.loopingcall [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 829.688141] env[59857]: DEBUG nova.compute.manager [-] [instance: 4e0763e1-7348-472a-8838-648991382724] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 829.688234] env[59857]: DEBUG nova.network.neutron [-] [instance: 4e0763e1-7348-472a-8838-648991382724] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 829.704603] env[59857]: DEBUG nova.network.neutron [-] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 829.713986] env[59857]: DEBUG nova.network.neutron [-] [instance: 4e0763e1-7348-472a-8838-648991382724] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 829.720946] env[59857]: INFO nova.compute.manager [-] [instance: 4e0763e1-7348-472a-8838-648991382724] Took 0.03 seconds to deallocate network for instance.
[ 829.722829] env[59857]: DEBUG nova.compute.claims [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 829.723016] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 829.723222] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 829.795422] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11c56c1e-79ac-48fc-8d00-b45cec46f34a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.803379] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3e4188e-4ee5-4572-a18c-7a038168d78d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.834843] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e5dcdfc-ee5e-44ef-8465-7b7c256bca38 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.842056] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7afd78d-97a3-4811-97bd-0c8be8e9056c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.854784] env[59857]: DEBUG nova.compute.provider_tree [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 829.863279] env[59857]: DEBUG nova.scheduler.client.report [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 829.875627] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.152s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 829.877119] env[59857]: ERROR nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information.
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] Traceback (most recent call last):
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self.driver.spawn(context, instance, image_meta,
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] vm_ref = self.build_virtual_machine(instance,
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] vif_infos = vmwarevif.get_vif_info(self._session,
[ 829.877119] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] for vif in network_info:
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return self._sync_wrapper(fn, *args, **kwargs)
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self.wait()
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self[:] = self._gt.wait()
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return self._exit_event.wait()
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] result = hub.switch()
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return self.greenlet.switch()
[ 829.877507] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] result = function(*args, **kwargs)
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] return func(*args, **kwargs)
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] raise e
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] nwinfo = self.network_api.allocate_for_instance(
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] created_port_ids = self._update_ports_for_instance(
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] with excutils.save_and_reraise_exception():
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 829.877894] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] self.force_reraise()
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] raise self.value
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] updated_port = self._update_port(
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] _ensure_no_port_binding_failure(port)
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] raise exception.PortBindingFailed(port_id=port['id'])
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724] nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information.
[ 829.878435] env[59857]: ERROR nova.compute.manager [instance: 4e0763e1-7348-472a-8838-648991382724]
[ 829.878435] env[59857]: DEBUG nova.compute.utils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 829.878786] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Build of instance 4e0763e1-7348-472a-8838-648991382724 was re-scheduled: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 829.878786] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 829.878947] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 829.879102] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquired lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 829.879260] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 829.906015] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 829.993250] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 830.003725] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Releasing lock "refresh_cache-4e0763e1-7348-472a-8838-648991382724" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 830.003934] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 830.004124] env[59857]: DEBUG nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 830.004283] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 830.021399] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 830.028325] env[59857]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 830.035976] env[59857]: INFO nova.compute.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Took 0.03 seconds to deallocate network for instance.
[ 830.111301] env[59857]: INFO nova.scheduler.client.report [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Deleted allocations for instance 4e0763e1-7348-472a-8838-648991382724
[ 830.128037] env[59857]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "4e0763e1-7348-472a-8838-648991382724" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 2.203s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 862.775110] env[59857]: WARNING oslo_vmware.rw_handles [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles response.begin()
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 862.775110] env[59857]: ERROR oslo_vmware.rw_handles
[ 862.775989] env[59857]: DEBUG nova.virt.vmwareapi.images [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Downloaded image file data 4a4a4830-1ff7-4cff-ab75-d665942f46b5 to vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk on the data store datastore2 {{(pid=59857) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 862.778072] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Caching image {{(pid=59857) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 862.778316] env[59857]: DEBUG nova.virt.vmwareapi.vm_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Copying Virtual Disk [datastore2] vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/tmp-sparse.vmdk to [datastore2] vmware_temp/0d9685a0-647d-4820-a540-4034c8ad1f8f/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk {{(pid=59857) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 862.778641] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e2be618e-7a49-40c8-9612-61795bdefb5b {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 862.786254] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Waiting for the task: (returnval){
[ 862.786254] env[59857]: value = "task-1341463"
[ 862.786254] env[59857]: _type = "Task"
[ 862.786254] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 862.793965] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Task: {'id': task-1341463, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 863.297534] env[59857]: DEBUG oslo_vmware.exceptions [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Fault InvalidArgument not matched. {{(pid=59857) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 863.297816] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Releasing lock "[datastore2] devstack-image-cache_base/4a4a4830-1ff7-4cff-ab75-d665942f46b5/4a4a4830-1ff7-4cff-ab75-d665942f46b5.vmdk" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 863.298373] env[59857]: ERROR nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 863.298373] env[59857]: Faults: ['InvalidArgument']
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Traceback (most recent call last):
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] yield resources
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self.driver.spawn(context, instance, image_meta,
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self._fetch_image_if_missing(context, vi)
[ 863.298373] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] image_cache(vi, tmp_image_ds_loc)
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] vm_util.copy_virtual_disk(
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] session._wait_for_task(vmdk_copy_task)
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] return self.wait_for_task(task_ref)
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] return evt.wait()
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] result = hub.switch()
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 863.298788] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] return self.greenlet.switch()
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self.f(*self.args, **self.kw)
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] raise exceptions.translate_fault(task_info.error)
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Faults: ['InvalidArgument']
[ 863.299244] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14]
[ 863.299244] env[59857]: INFO nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Terminating instance
[ 863.301240] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "refresh_cache-26aa196e-e745-494d-814f-7da3cf18ec14" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 863.301389] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquired lock "refresh_cache-26aa196e-e745-494d-814f-7da3cf18ec14" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 863.301548] env[59857]: DEBUG nova.network.neutron [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Building network info cache for
instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 863.325160] env[59857]: DEBUG nova.network.neutron [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 863.380893] env[59857]: DEBUG nova.network.neutron [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 863.389117] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Releasing lock "refresh_cache-26aa196e-e745-494d-814f-7da3cf18ec14" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 863.389504] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 863.389690] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 863.390717] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ebd735e-5911-4b2f-b94a-37883b9243b7 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.398155] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Unregistering the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 863.398356] env[59857]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f92f7e97-da58-4e21-b37b-58088e6c1a5e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.424012] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Unregistered the VM {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 863.424210] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Deleting contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 863.424378] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Deleting the datastore file [datastore2] 26aa196e-e745-494d-814f-7da3cf18ec14 {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 863.424587] env[59857]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a3266b72-31f3-4c1d-b784-f8b2ef4347f9 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.430069] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Waiting for the task: (returnval){ [ 863.430069] env[59857]: value = "task-1341465" [ 863.430069] env[59857]: _type = "Task" [ 863.430069] env[59857]: } to complete. {{(pid=59857) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 863.437215] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Task: {'id': task-1341465, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.939535] env[59857]: DEBUG oslo_vmware.api [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Task: {'id': task-1341465, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.029362} completed successfully. 
{{(pid=59857) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 863.939881] env[59857]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Deleted the datastore file {{(pid=59857) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 863.939946] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Deleted contents of the VM from datastore datastore2 {{(pid=59857) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 863.940111] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 863.940280] env[59857]: INFO nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Took 0.55 seconds to destroy the instance on the hypervisor. [ 863.940516] env[59857]: DEBUG oslo.service.loopingcall [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 863.940712] env[59857]: DEBUG nova.compute.manager [-] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Skipping network deallocation for instance since networking was not requested. {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 863.942721] env[59857]: DEBUG nova.compute.claims [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 863.942884] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.943112] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 864.000036] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-232f0936-ac2f-43f1-a7d2-1ce969922237 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.007236] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e6899e4-2e98-4dc9-b9bc-70f0ada32683 {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.035890] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c86c0c29-954a-4541-a160-fa738f86d85a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.042495] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c346dceb-6cc4-48b9-8e88-76eee231b957 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.056008] env[59857]: DEBUG nova.compute.provider_tree [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 864.063769] env[59857]: DEBUG nova.scheduler.client.report [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 864.078151] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.135s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 864.078659] env[59857]: ERROR nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 864.078659] env[59857]: Faults: ['InvalidArgument'] [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Traceback (most recent call last): [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self.driver.spawn(context, instance, image_meta, [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self._vmops.spawn(context, instance, image_meta, injected_files, [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self._fetch_image_if_missing(context, vi) [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 
26aa196e-e745-494d-814f-7da3cf18ec14] image_cache(vi, tmp_image_ds_loc) [ 864.078659] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] vm_util.copy_virtual_disk( [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] session._wait_for_task(vmdk_copy_task) [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] return self.wait_for_task(task_ref) [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] return evt.wait() [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] result = hub.switch() [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] return self.greenlet.switch() [ 
864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 864.079033] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] self.f(*self.args, **self.kw) [ 864.079651] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 864.079651] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] raise exceptions.translate_fault(task_info.error) [ 864.079651] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 864.079651] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Faults: ['InvalidArgument'] [ 864.079651] env[59857]: ERROR nova.compute.manager [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] [ 864.079651] env[59857]: DEBUG nova.compute.utils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] VimFaultException {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 864.080620] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Build of instance 26aa196e-e745-494d-814f-7da3cf18ec14 was re-scheduled: A specified parameter was not correct: fileType [ 864.080620] env[59857]: Faults: ['InvalidArgument'] {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 864.081000] env[59857]: DEBUG 
nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 864.081224] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "refresh_cache-26aa196e-e745-494d-814f-7da3cf18ec14" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 864.081365] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquired lock "refresh_cache-26aa196e-e745-494d-814f-7da3cf18ec14" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 864.081519] env[59857]: DEBUG nova.network.neutron [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 864.105909] env[59857]: DEBUG nova.network.neutron [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 864.159131] env[59857]: DEBUG nova.network.neutron [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.167241] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Releasing lock "refresh_cache-26aa196e-e745-494d-814f-7da3cf18ec14" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 864.167535] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 864.167722] env[59857]: DEBUG nova.compute.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 864.244043] env[59857]: INFO nova.scheduler.client.report [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Deleted allocations for instance 26aa196e-e745-494d-814f-7da3cf18ec14 [ 864.258945] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "26aa196e-e745-494d-814f-7da3cf18ec14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 138.143s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 866.841183] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 866.841377] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Cleaning up deleted instances {{(pid=59857) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 866.853184] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] There are 0 instances to clean {{(pid=59857) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 866.853388] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 866.853520] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Cleaning up deleted instances with incomplete 
migration {{(pid=59857) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 866.862807] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 867.869033] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.724087] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "4d789e06-d563-46a0-80fc-0040ce074bff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.724311] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "4d789e06-d563-46a0-80fc-0040ce074bff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.733339] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 868.783075] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.783338] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.785291] env[59857]: INFO nova.compute.claims [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 868.854957] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc564bd6-26f4-4b2a-a239-7fd3c6b2f090 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.863429] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4386a1f6-f138-4a2d-8fdf-8f89598efa09 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.893924] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa1de13f-f2a3-47f3-84f6-dee78b47b8fa {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.901405] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d208f12b-745d-439b-a174-d1af94989e11 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.914324] env[59857]: DEBUG nova.compute.provider_tree [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 868.922636] env[59857]: DEBUG nova.scheduler.client.report [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 868.935622] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.936097] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 
tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 868.971177] env[59857]: DEBUG nova.compute.utils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 868.972532] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 868.972702] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 868.983563] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 869.028247] env[59857]: DEBUG nova.policy [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ba985c594674d8ab57f66762d73fe52', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1aa856aa79e94290acfdb44c20d4a028', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 869.067153] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 869.088968] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 869.089082] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 869.089259] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 869.089476] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Flavor 
pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 869.089567] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 869.089708] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 869.089934] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 869.090421] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 869.090421] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 869.090524] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 
tempest-AttachVolumeNegativeTest-1762778477-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 869.091041] env[59857]: DEBUG nova.virt.hardware [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 869.091495] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dadc83cd-2689-4bc5-9cb5-2b751b2ae655 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 869.099703] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a424b343-eba9-427f-848c-a0985496dbb0 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 869.315921] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Successfully created port: 36f52190-23be-485b-91d7-1066f2ed40cf {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 869.840107] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 869.840315] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Starting heal instance info cache {{(pid=59857) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9814}} [ 869.840439] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Rebuilding the list of instances to heal {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 869.850819] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Skipping network cache update for instance because it is Building. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 869.850969] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Didn't find any instances for network info cache update. {{(pid=59857) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 869.851187] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.166861] env[59857]: DEBUG nova.compute.manager [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Received event network-changed-36f52190-23be-485b-91d7-1066f2ed40cf {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 870.167222] env[59857]: DEBUG nova.compute.manager [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Refreshing instance network info cache due to event network-changed-36f52190-23be-485b-91d7-1066f2ed40cf. 
{{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 870.167352] env[59857]: DEBUG oslo_concurrency.lockutils [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] Acquiring lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.167604] env[59857]: DEBUG oslo_concurrency.lockutils [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] Acquired lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 870.167862] env[59857]: DEBUG nova.network.neutron [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Refreshing network info cache for port 36f52190-23be-485b-91d7-1066f2ed40cf {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 870.212914] env[59857]: DEBUG nova.network.neutron [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 870.357240] env[59857]: DEBUG nova.network.neutron [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.366163] env[59857]: DEBUG oslo_concurrency.lockutils [req-01f50ecb-8a72-4d43-9f85-304c5da8fd14 req-474d4b4f-2a4a-4a08-9cbf-2a72cb53fd92 service nova] Releasing lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 870.409561] env[59857]: ERROR nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. 
[ 870.409561] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 870.409561] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 870.409561] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 870.409561] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 870.409561] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 870.409561] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 870.409561] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 870.409561] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 870.409561] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 870.409561] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 870.409561] env[59857]: ERROR nova.compute.manager raise self.value [ 870.409561] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 870.409561] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 870.409561] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 870.409561] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 870.410160] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 870.410160] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 870.410160] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. [ 870.410160] env[59857]: ERROR nova.compute.manager [ 870.410160] env[59857]: Traceback (most recent call last): [ 870.410160] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 870.410160] env[59857]: listener.cb(fileno) [ 870.410160] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 870.410160] env[59857]: result = function(*args, **kwargs) [ 870.410160] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 870.410160] env[59857]: return func(*args, **kwargs) [ 870.410160] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 870.410160] env[59857]: raise e [ 870.410160] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 870.410160] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 870.410160] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 870.410160] env[59857]: created_port_ids = self._update_ports_for_instance( [ 870.410160] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 870.410160] env[59857]: with excutils.save_and_reraise_exception(): [ 870.410160] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 870.410160] env[59857]: self.force_reraise() [ 870.410160] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 870.410160] env[59857]: raise self.value [ 870.410160] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 870.410160] env[59857]: updated_port = self._update_port( [ 870.410160] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 870.410160] env[59857]: _ensure_no_port_binding_failure(port) [ 870.410160] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 870.410160] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 870.411355] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. [ 870.411355] env[59857]: Removing descriptor: 12 [ 870.411355] env[59857]: ERROR nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Traceback (most recent call last): [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] yield resources [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self.driver.spawn(context, instance, image_meta, [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 870.411355] env[59857]: ERROR nova.compute.manager 
[instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 870.411355] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] vm_ref = self.build_virtual_machine(instance, [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] vif_infos = vmwarevif.get_vif_info(self._session, [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] for vif in network_info: [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return self._sync_wrapper(fn, *args, **kwargs) [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self.wait() [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self[:] = self._gt.wait() [ 870.411905] 
env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return self._exit_event.wait() [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 870.411905] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] result = hub.switch() [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return self.greenlet.switch() [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] result = function(*args, **kwargs) [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return func(*args, **kwargs) [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] raise e [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] nwinfo = self.network_api.allocate_for_instance( [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] created_port_ids = self._update_ports_for_instance( [ 870.412397] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] with excutils.save_and_reraise_exception(): [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self.force_reraise() [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] raise self.value [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] updated_port = self._update_port( [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] _ensure_no_port_binding_failure(port) [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] raise exception.PortBindingFailed(port_id=port['id']) [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. [ 870.412837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] [ 870.413310] env[59857]: INFO nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Terminating instance [ 870.413310] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.413310] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquired lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 870.413310] env[59857]: DEBUG nova.network.neutron [None 
req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 870.437951] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 870.530936] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.539734] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Releasing lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 870.540488] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 870.540488] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 870.542583] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bffd743b-d57a-47b8-99f3-4241a69d5e5d {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.550921] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-332982b3-b839-4774-a99a-d942a5457922 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.571710] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4d789e06-d563-46a0-80fc-0040ce074bff could not be found. 
[ 870.571908] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 870.572099] env[59857]: INFO nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Took 0.03 seconds to destroy the instance on the hypervisor. [ 870.572327] env[59857]: DEBUG oslo.service.loopingcall [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 870.572522] env[59857]: DEBUG nova.compute.manager [-] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 870.572610] env[59857]: DEBUG nova.network.neutron [-] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 870.589356] env[59857]: DEBUG nova.network.neutron [-] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 870.596622] env[59857]: DEBUG nova.network.neutron [-] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.603976] env[59857]: INFO nova.compute.manager [-] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Took 0.03 seconds to deallocate network for instance. [ 870.605755] env[59857]: DEBUG nova.compute.claims [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 870.605891] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 870.606106] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 870.717279] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09422260-0d20-4962-9e2b-36f62e6f0a6e {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.724824] env[59857]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce64d092-41d7-4e43-a6e5-65ffa64a8212 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.755046] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f908d9c-7037-40c2-9b2e-6460121faa05 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.761665] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5d1dcb-af71-4a57-801e-d45b709a13bd {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.774370] env[59857]: DEBUG nova.compute.provider_tree [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 870.782009] env[59857]: DEBUG nova.scheduler.client.report [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 870.794261] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 
tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.188s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 870.794837] env[59857]: ERROR nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Traceback (most recent call last): [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self.driver.spawn(context, instance, image_meta, [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] vm_ref = self.build_virtual_machine(instance, [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] vif_infos = vmwarevif.get_vif_info(self._session, [ 870.794837] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] for vif in network_info: [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return self._sync_wrapper(fn, *args, **kwargs) [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self.wait() [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self[:] = self._gt.wait() [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return self._exit_event.wait() [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 870.795297] env[59857]: ERROR 
nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] result = hub.switch() [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return self.greenlet.switch() [ 870.795297] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] result = function(*args, **kwargs) [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] return func(*args, **kwargs) [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] raise e [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] nwinfo = self.network_api.allocate_for_instance( [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] created_port_ids = 
self._update_ports_for_instance( [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] with excutils.save_and_reraise_exception(): [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 870.795906] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] self.force_reraise() [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] raise self.value [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] updated_port = self._update_port( [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] _ensure_no_port_binding_failure(port) [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] raise 
exception.PortBindingFailed(port_id=port['id']) [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. [ 870.796354] env[59857]: ERROR nova.compute.manager [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] [ 870.796354] env[59857]: DEBUG nova.compute.utils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 870.796733] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Build of instance 4d789e06-d563-46a0-80fc-0040ce074bff was re-scheduled: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information. 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 870.797148] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 870.797362] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.797503] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquired lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 870.797684] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 870.818555] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 870.840359] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.840359] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.909487] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.917439] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Releasing lock "refresh_cache-4d789e06-d563-46a0-80fc-0040ce074bff" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 870.917707] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 870.917998] env[59857]: DEBUG nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 870.918181] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 870.932258] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 870.938950] env[59857]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.946150] env[59857]: INFO nova.compute.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Took 0.03 seconds to deallocate network for instance. 
[ 871.024431] env[59857]: INFO nova.scheduler.client.report [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Deleted allocations for instance 4d789e06-d563-46a0-80fc-0040ce074bff [ 871.041126] env[59857]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "4d789e06-d563-46a0-80fc-0040ce074bff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 2.317s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.835253] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.840931] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.840931] env[59857]: DEBUG nova.compute.manager [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59857) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 871.840931] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager.update_available_resource {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.850657] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.850860] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.851026] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.851186] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59857) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 871.852258] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-588a8e28-0504-45ee-abe1-32f5edd60183 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.861139] env[59857]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dae8e03-1a4d-42a9-bbcc-ac4f3f3dec75 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.875008] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec1bb62d-df0f-4c7e-ae2f-83970cf8710f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.881299] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d67e5dcc-b6bb-4840-8b33-02c707849c5c {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.910545] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181484MB free_disk=154GB free_vcpus=48 pci_devices=None {{(pid=59857) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 871.910939] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.910939] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.958022] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59857) 
_report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 871.958022] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59857) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 871.972838] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Refreshing inventories for resource provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 871.985261] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Updating ProviderTree inventory for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 871.986048] env[59857]: DEBUG nova.compute.provider_tree [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Updating inventory in ProviderTree for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 
1, 'allocation_ratio': 1.0}} {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 871.994770] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Refreshing aggregate associations for resource provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11, aggregates: None {{(pid=59857) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 872.011338] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Refreshing trait associations for resource provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=59857) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 872.023990] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-241e45af-1ab4-4ce4-b24d-b6861b76695f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.032051] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8e71754-1052-42fe-a410-7f51ee00f020 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.061763] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2759efef-a009-4dba-bf87-e06452d9e207 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.069018] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd89483e-0646-4304-9366-b80a257b789f {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.083250] env[59857]: DEBUG nova.compute.provider_tree [None 
req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 872.091117] env[59857]: DEBUG nova.scheduler.client.report [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 872.103526] env[59857]: DEBUG nova.compute.resource_tracker [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59857) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 872.103809] env[59857]: DEBUG oslo_concurrency.lockutils [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.104565] env[59857]: DEBUG oslo_service.periodic_task [None req-d7342279-227d-4e91-b417-3e8a535c86cf None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59857) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.255029] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 
tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.255029] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.264374] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Starting instance... 
{{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 873.313753] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.313993] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.315546] env[59857]: INFO nova.compute.claims [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 873.382696] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c19f16a-d6ee-4e4f-a414-575b80347b16 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.390526] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39f291b9-60ae-440e-bb16-3732428e0fe2 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.419368] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-950bc9c6-5d36-4660-8939-3737e952409d {{(pid=59857) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.426108] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-538b8b3f-0174-4b62-9bba-724211e469c4 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.440313] env[59857]: DEBUG nova.compute.provider_tree [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 873.450102] env[59857]: DEBUG nova.scheduler.client.report [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 873.461808] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.462251] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 
tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Start building networks asynchronously for instance. {{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 873.491543] env[59857]: DEBUG nova.compute.utils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Using /dev/sd instead of None {{(pid=59857) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 873.492624] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Allocating IP information in the background. {{(pid=59857) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 873.492799] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] allocate_for_instance() {{(pid=59857) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 873.502259] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Start building block device mappings for instance. 
{{(pid=59857) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 873.545486] env[59857]: DEBUG nova.policy [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ba985c594674d8ab57f66762d73fe52', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1aa856aa79e94290acfdb44c20d4a028', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59857) authorize /opt/stack/nova/nova/policy.py:203}} [ 873.561423] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Start spawning the instance on the hypervisor. 
{{(pid=59857) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 873.580881] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-23T14:43:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-23T14:42:57Z,direct_url=,disk_format='vmdk',id=4a4a4830-1ff7-4cff-ab75-d665942f46b5,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='88755ffc9d394d14928f0692771e61f5',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-23T14:42:58Z,virtual_size=,visibility=), allow threads: False {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 873.581120] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Flavor limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 873.581276] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Image limits 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 873.581451] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Flavor 
pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 873.581590] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Image pref 0:0:0 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 873.581732] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59857) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 873.581929] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 873.582093] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 873.582259] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Got 1 possible topologies {{(pid=59857) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 873.582416] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 
tempest-AttachVolumeNegativeTest-1762778477-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 873.582585] env[59857]: DEBUG nova.virt.hardware [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59857) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 873.583436] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23633081-2772-4173-8f38-27e8af3b5d93 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.591227] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3d7f4f8-20e9-447d-b833-fa0ef7b16789 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.820165] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Successfully created port: 63e41031-ef05-42ad-984e-0c452e5e8238 {{(pid=59857) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 874.528013] env[59857]: DEBUG nova.compute.manager [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Received event network-changed-63e41031-ef05-42ad-984e-0c452e5e8238 {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 874.528264] env[59857]: DEBUG nova.compute.manager [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service 
nova] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Refreshing instance network info cache due to event network-changed-63e41031-ef05-42ad-984e-0c452e5e8238. {{(pid=59857) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 874.528541] env[59857]: DEBUG oslo_concurrency.lockutils [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] Acquiring lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 874.528541] env[59857]: DEBUG oslo_concurrency.lockutils [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] Acquired lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 874.528719] env[59857]: DEBUG nova.network.neutron [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Refreshing network info cache for port 63e41031-ef05-42ad-984e-0c452e5e8238 {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 874.580725] env[59857]: DEBUG nova.network.neutron [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 874.730762] env[59857]: DEBUG nova.network.neutron [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 874.740758] env[59857]: DEBUG oslo_concurrency.lockutils [req-9b65f93a-54b3-4163-8e10-b1deceb57bd8 req-2fdcd25c-d5fa-4863-82b1-b9db691866aa service nova] Releasing lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 874.774402] env[59857]: ERROR nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. 
[ 874.774402] env[59857]: ERROR nova.compute.manager Traceback (most recent call last): [ 874.774402] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 874.774402] env[59857]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 874.774402] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 874.774402] env[59857]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 874.774402] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 874.774402] env[59857]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 874.774402] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 874.774402] env[59857]: ERROR nova.compute.manager self.force_reraise() [ 874.774402] env[59857]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 874.774402] env[59857]: ERROR nova.compute.manager raise self.value [ 874.774402] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 874.774402] env[59857]: ERROR nova.compute.manager updated_port = self._update_port( [ 874.774402] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 874.774402] env[59857]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 874.774918] env[59857]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 874.774918] env[59857]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 874.774918] env[59857]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. [ 874.774918] env[59857]: ERROR nova.compute.manager [ 874.774918] env[59857]: Traceback (most recent call last): [ 874.774918] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 874.774918] env[59857]: listener.cb(fileno) [ 874.774918] env[59857]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 874.774918] env[59857]: result = function(*args, **kwargs) [ 874.774918] env[59857]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 874.774918] env[59857]: return func(*args, **kwargs) [ 874.774918] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 874.774918] env[59857]: raise e [ 874.774918] env[59857]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 874.774918] env[59857]: nwinfo = self.network_api.allocate_for_instance( [ 874.774918] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 874.774918] env[59857]: created_port_ids = self._update_ports_for_instance( [ 874.774918] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 874.774918] env[59857]: with excutils.save_and_reraise_exception(): [ 874.774918] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 874.774918] env[59857]: self.force_reraise() [ 874.774918] env[59857]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 874.774918] env[59857]: raise self.value [ 874.774918] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 874.774918] env[59857]: updated_port = self._update_port( [ 874.774918] 
env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 874.774918] env[59857]: _ensure_no_port_binding_failure(port) [ 874.774918] env[59857]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 874.774918] env[59857]: raise exception.PortBindingFailed(port_id=port['id']) [ 874.775848] env[59857]: nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. [ 874.775848] env[59857]: Removing descriptor: 12 [ 874.775848] env[59857]: ERROR nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Traceback (most recent call last): [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] yield resources [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] self.driver.spawn(context, instance, image_meta, [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 874.775848] env[59857]: ERROR nova.compute.manager 
[instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] self._vmops.spawn(context, instance, image_meta, injected_files, [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 874.775848] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] vm_ref = self.build_virtual_machine(instance, [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] vif_infos = vmwarevif.get_vif_info(self._session, [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] for vif in network_info: [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] return self._sync_wrapper(fn, *args, **kwargs) [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] self.wait() [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] self[:] = self._gt.wait() [ 874.776345] 
env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] return self._exit_event.wait() [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 874.776345] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] result = hub.switch() [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] return self.greenlet.switch() [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] result = function(*args, **kwargs) [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] return func(*args, **kwargs) [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] raise e [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] nwinfo = self.network_api.allocate_for_instance( [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] created_port_ids = self._update_ports_for_instance( [ 874.776745] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] with excutils.save_and_reraise_exception(): [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] self.force_reraise() [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] raise self.value [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] updated_port = self._update_port( [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] _ensure_no_port_binding_failure(port) [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] raise exception.PortBindingFailed(port_id=port['id']) [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. [ 874.777195] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] [ 874.778077] env[59857]: INFO nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Terminating instance [ 874.780262] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 874.780416] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquired lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 874.780578] env[59857]: DEBUG nova.network.neutron [None 
req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 874.803720] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 875.043931] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 875.052386] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Releasing lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 875.052762] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Start destroying the instance on the hypervisor. 
{{(pid=59857) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 875.052947] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Destroying instance {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 875.053414] env[59857]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-39d0f128-099a-4714-84a6-e2c9f4c11552 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.062551] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13c868c2-b2d8-4217-8e53-15e464b0a82a {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.085378] env[59857]: WARNING nova.virt.vmwareapi.vmops [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165 could not be found. 
[ 875.085549] env[59857]: DEBUG nova.virt.vmwareapi.vmops [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance destroyed {{(pid=59857) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 875.085748] env[59857]: INFO nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Took 0.03 seconds to destroy the instance on the hypervisor. [ 875.085976] env[59857]: DEBUG oslo.service.loopingcall [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59857) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 875.086191] env[59857]: DEBUG nova.compute.manager [-] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 875.086282] env[59857]: DEBUG nova.network.neutron [-] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 875.101149] env[59857]: DEBUG nova.network.neutron [-] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance cache missing network info. 
{{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 875.107874] env[59857]: DEBUG nova.network.neutron [-] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 875.115074] env[59857]: INFO nova.compute.manager [-] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Took 0.03 seconds to deallocate network for instance.
[ 875.116871] env[59857]: DEBUG nova.compute.claims [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Aborting claim: {{(pid=59857) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 875.117050] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 875.117258] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 875.173578] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cbf23d2-4c30-4852-9a14-1e01ef7060e3 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 875.181250] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d105b2-0216-4e3f-9f6b-8b9f66529476 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 875.209903] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbc3b5e-df28-4c68-9a26-a4d8b0b31856 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 875.216906] env[59857]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bc61087-9024-461d-b894-e69295148dd8 {{(pid=59857) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 875.229892] env[59857]: DEBUG nova.compute.provider_tree [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed in ProviderTree for provider: 80c650ad-13a5-4d4e-96b3-a14b31abfa11 {{(pid=59857) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 875.238045] env[59857]: DEBUG nova.scheduler.client.report [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Inventory has not changed for provider 80c650ad-13a5-4d4e-96b3-a14b31abfa11 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 154, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59857) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 875.250368] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.133s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 875.250969] env[59857]: ERROR nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information.
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Traceback (most recent call last):
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     self.driver.spawn(context, instance, image_meta,
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     vm_ref = self.build_virtual_machine(instance,
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 875.250969] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     for vif in network_info:
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     return self._sync_wrapper(fn, *args, **kwargs)
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     self.wait()
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     self[:] = self._gt.wait()
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     return self._exit_event.wait()
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     result = hub.switch()
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     return self.greenlet.switch()
[ 875.251304] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     result = function(*args, **kwargs)
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     return func(*args, **kwargs)
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     raise e
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     nwinfo = self.network_api.allocate_for_instance(
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     created_port_ids = self._update_ports_for_instance(
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     with excutils.save_and_reraise_exception():
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 875.251866] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     self.force_reraise()
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     raise self.value
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     updated_port = self._update_port(
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     _ensure_no_port_binding_failure(port)
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]     raise exception.PortBindingFailed(port_id=port['id'])
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information.
[ 875.252668] env[59857]: ERROR nova.compute.manager [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165]
[ 875.252668] env[59857]: DEBUG nova.compute.utils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. {{(pid=59857) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 875.253160] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Build of instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165 was re-scheduled: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information. {{(pid=59857) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 875.253435] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Unplugging VIFs for instance {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 875.253653] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 875.253796] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquired lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 875.253976] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Building network info cache for instance {{(pid=59857) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 875.276847] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 875.375531] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 875.384746] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Releasing lock "refresh_cache-305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" {{(pid=59857) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 875.384986] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59857) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 875.385206] env[59857]: DEBUG nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Deallocating network for instance {{(pid=59857) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 875.385372] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] deallocate_for_instance() {{(pid=59857) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 875.402695] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance cache missing network info. {{(pid=59857) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 875.411213] env[59857]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Updating instance_info_cache with network_info: [] {{(pid=59857) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 875.420206] env[59857]: INFO nova.compute.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Took 0.03 seconds to deallocate network for instance.
[ 875.508739] env[59857]: INFO nova.scheduler.client.report [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Deleted allocations for instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165
[ 875.528962] env[59857]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.274s {{(pid=59857) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}