[ 556.551145] env[59491]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 557.000654] env[59534]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 558.547396] env[59534]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59534) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.547746] env[59534]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59534) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.552066] env[59534]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59534) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.552066] env[59534]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 558.552066] env[59534]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 558.666796] env[59534]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59534) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 558.677382] env[59534]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=59534) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 558.778091] env[59534]: INFO nova.virt.driver [None req-4c6bc501-a867-41de-a9c8-20ab13780f16 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 558.850972] env[59534]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.851192] env[59534]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.851326] env[59534]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59534) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 562.114180] env[59534]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-b668baaa-1411-4b92-bbee-02ca130639c4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.130313] env[59534]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59534) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 562.130455] env[59534]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-da127908-1140-47c6-99f6-de90dce73fa0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.166187] env[59534]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 9e29f.
[ 562.166348] env[59534]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.315s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 562.166958] env[59534]: INFO nova.virt.vmwareapi.driver [None req-4c6bc501-a867-41de-a9c8-20ab13780f16 None None] VMware vCenter version: 7.0.3
[ 562.170328] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bd15186-3453-4cad-b57b-6e75a8ec325b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.189877] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e2e7fcf-9769-48d4-9976-90ca5c7b4a53 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.196233] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-582ba4b5-597b-4f70-beef-d0d35d0e595b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.203628] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00bfb896-4706-4d45-8220-94c105758057 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.216571] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d82dde8c-ec8f-4aba-9873-b8ca02cab71f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.222474] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-727fe5f4-00b0-463d-bc98-032f14bf8f3f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.252376] env[59534]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-15fcd5dd-cb32-44e5-bf8e-24c3ef048dd7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.257704] env[59534]: DEBUG nova.virt.vmwareapi.driver [None req-4c6bc501-a867-41de-a9c8-20ab13780f16 None None] Extension org.openstack.compute already exists. {{(pid=59534) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 562.260289] env[59534]: INFO nova.compute.provider_config [None req-4c6bc501-a867-41de-a9c8-20ab13780f16 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 562.277116] env[59534]: DEBUG nova.context [None req-4c6bc501-a867-41de-a9c8-20ab13780f16 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),43d564f7-5143-479a-991c-c11659f6272d(cell1) {{(pid=59534) load_cells /opt/stack/nova/nova/context.py:464}}
[ 562.279059] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 562.279277] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 562.280038] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 562.280380] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Acquiring lock "43d564f7-5143-479a-991c-c11659f6272d" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 562.280569] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Lock "43d564f7-5143-479a-991c-c11659f6272d" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 562.281531] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Lock "43d564f7-5143-479a-991c-c11659f6272d" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 562.294059] env[59534]: DEBUG oslo_db.sqlalchemy.engines [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59534) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 562.294673] env[59534]: DEBUG oslo_db.sqlalchemy.engines [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59534) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 562.300564] env[59534]: ERROR nova.db.main.api [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 562.300564] env[59534]:     result = function(*args, **kwargs)
[ 562.300564] env[59534]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 562.300564] env[59534]:     return func(*args, **kwargs)
[ 562.300564] env[59534]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 562.300564] env[59534]:     result = fn(*args, **kwargs)
[ 562.300564] env[59534]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 562.300564] env[59534]:     return f(*args, **kwargs)
[ 562.300564] env[59534]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 562.300564] env[59534]:     return db.service_get_minimum_version(context, binaries)
[ 562.300564] env[59534]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 562.300564] env[59534]:     _check_db_access()
[ 562.300564] env[59534]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 562.300564] env[59534]:     stacktrace = ''.join(traceback.format_stack())
[ 562.300564] env[59534]: 
[ 562.301970] env[59534]: ERROR nova.db.main.api [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 562.301970] env[59534]:     result = function(*args, **kwargs)
[ 562.301970] env[59534]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 562.301970] env[59534]:     return func(*args, **kwargs)
[ 562.301970] env[59534]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 562.301970] env[59534]:     result = fn(*args, **kwargs)
[ 562.301970] env[59534]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 562.301970] env[59534]:     return f(*args, **kwargs)
[ 562.301970] env[59534]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 562.301970] env[59534]:     return db.service_get_minimum_version(context, binaries)
[ 562.301970] env[59534]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 562.301970] env[59534]:     _check_db_access()
[ 562.301970] env[59534]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 562.301970] env[59534]:     stacktrace = ''.join(traceback.format_stack())
[ 562.301970] env[59534]: 
[ 562.302572] env[59534]: WARNING nova.objects.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 562.302572] env[59534]: WARNING nova.objects.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Failed to get minimum service version for cell 43d564f7-5143-479a-991c-c11659f6272d
[ 562.302853] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Acquiring lock "singleton_lock" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 562.303014] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Acquired lock "singleton_lock" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 562.303299] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Releasing lock "singleton_lock" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 562.303564] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Full set of CONF: {{(pid=59534) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 562.303705] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ******************************************************************************** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 562.303829] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] Configuration options gathered from: {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 562.303960] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 562.304171] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 562.304296] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ================================================================================ {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 562.304496] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] allow_resize_to_same_host = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.304660] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] arq_binding_timeout = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.304788] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] backdoor_port = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.304910] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] backdoor_socket = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.305089] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] block_device_allocate_retries = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.305250] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] block_device_allocate_retries_interval = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.305414] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cert = self.pem {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.305576] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.305744] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute_monitors = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.305903] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] config_dir = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.306080] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] config_drive_format = iso9660 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.306216] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.306379] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] config_source = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.306543] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] console_host = devstack {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.306700] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] control_exchange = nova {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.306855] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cpu_allocation_ratio = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.307019] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] daemon = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.307183] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] debug = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.307337] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] default_access_ip_network_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.307503] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] default_availability_zone = nova {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.307655] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] default_ephemeral_format = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.307887] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.308062] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] default_schedule_zone = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.308222] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] disk_allocation_ratio = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.308379] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] enable_new_services = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.308554] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] enabled_apis = ['osapi_compute'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.308716] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] enabled_ssl_apis = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.308871] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] flat_injected = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.309038] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] force_config_drive = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.309199] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] force_raw_images = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.309363] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] graceful_shutdown_timeout = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.309519] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] heal_instance_info_cache_interval = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.309727] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] host = cpu-1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.309892] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.310065] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.310224] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.310428] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.310592] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_build_timeout = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.310750] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_delete_interval = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.310914] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_format = [instance: %(uuid)s] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.311087] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_name_template = instance-%08x {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.311265] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_usage_audit = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.311410] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_usage_audit_period = month {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.311572] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.311734] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.311895] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] internal_service_availability_zone = internal {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312060] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] key = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312218] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] live_migration_retry_count = 30 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312376] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_config_append = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312538] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312692] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_dir = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312846] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.312969] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_options = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.313139] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_rotate_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.313316] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_rotate_interval_type = days {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.313480] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] log_rotation_type = none {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.313611] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.313736] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.313899] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314070] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314196] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314355] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] long_rpc_timeout = 1800 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314511] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] max_concurrent_builds = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314665] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] max_concurrent_live_migrations = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314820] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] max_concurrent_snapshots = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.314974] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] max_local_block_devices = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.315142] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] max_logfile_count = 30 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.315298] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] max_logfile_size_mb = 200 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.315452] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] maximum_instance_delete_attempts = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.315612] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metadata_listen = 0.0.0.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.315781] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metadata_listen_port = 8775 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.315947] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metadata_workers = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.316138] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] migrate_max_retries = -1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.316306] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] mkisofs_cmd = genisoimage {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.316511] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.316639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] my_ip = 10.180.1.21 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.316797] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] network_allocate_retries = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.316971] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.317149] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.317311] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] osapi_compute_listen_port = 8774 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.317477] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] osapi_compute_unique_server_name_scope = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.317641] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] osapi_compute_workers = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.317796] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] password_length = 12 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.317949] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] periodic_enable = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.318120] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] periodic_fuzzy_delay = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.318285] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] pointer_model = usbtablet {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.318450] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] preallocate_images = none {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.318604] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] publish_errors = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.318730] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] pybasedir = /opt/stack/nova {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.318882] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ram_allocation_ratio = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.319046] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rate_limit_burst = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.319210] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rate_limit_except_level = CRITICAL {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.319373] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rate_limit_interval = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.319530] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reboot_timeout = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.319685] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reclaim_instance_interval = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.319834] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] record = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320000] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reimage_timeout_per_gb = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320176] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] report_interval = 120 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320334] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rescue_timeout = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320492] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reserved_host_cpus = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320644] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reserved_host_disk_mb = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320796] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reserved_host_memory_mb = 512 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.320952] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] reserved_huge_pages = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.321123] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] resize_confirm_window = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.321279] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] resize_fs_using_block_device = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.321437] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] resume_guests_state_on_host_boot = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.321602] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.321760] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rpc_response_timeout = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.321914] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] run_external_periodic_tasks = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.322087] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] running_deleted_instance_action = reap {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.322245] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.322396] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] running_deleted_instance_timeout = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.322551] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler_instance_sync_interval = 120 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.322681] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_down_time = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.322846] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] servicegroup_driver = db {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.323009] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] shelved_offload_time = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.323172] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] shelved_poll_interval = 3600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.323342] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] shutdown_timeout = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.323500] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] source_is_ipv6 = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.323654] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ssl_only = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.323891] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.324065] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] sync_power_state_interval = 600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.324225] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] sync_power_state_pool_size = 1000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.324390] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] syslog_log_facility = LOG_USER {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.324544] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] tempdir = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.324700] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] timeout_nbd = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.324862] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] transport_url = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325029] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] update_resources_interval = 0 {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325190] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_cow_images = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325344] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_eventlog = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325497] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_journal = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325649] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_json = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325799] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_rootwrap_daemon = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.325948] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_stderr = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.326111] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] use_syslog = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.326260] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vcpu_pin_set = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.326424] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
vif_plugging_is_fatal = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.326584] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plugging_timeout = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.326745] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] virt_mkfs = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.326900] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] volume_usage_poll_interval = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.327067] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] watch_log_file = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.327233] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] web = /usr/share/spice-html5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.327416] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_concurrency.disable_process_locking = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.327711] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.327891] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.328067] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.328239] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.328404] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.328568] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.328744] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.auth_strategy = keystone {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.328908] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.compute_link_prefix = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.329092] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.329267] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.dhcp_domain = novalocal {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.329433] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.enable_instance_password = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.329591] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.glance_link_prefix = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.329752] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.329917] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.330090] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.instance_list_per_project_cells = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.330253] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.list_records_by_skipping_down_cells = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.330412] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.local_metadata_per_cell = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.330579] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.max_limit = 1000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.330743] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.metadata_cache_expiration = 15 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.330913] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.neutron_default_tenant_id = default {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.331087] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.use_forwarded_for = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.331253] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.use_neutron_default_nets = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.331425] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.331584] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.331747] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.331918] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.332098] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.vendordata_dynamic_targets = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.332266] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.vendordata_jsonfile_path = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.332448] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.332635] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.backend = dogpile.cache.memcached {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.332800] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.backend_argument = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.332969] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.config_prefix = cache.oslo {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.333150] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.dead_timeout = 60.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.333321] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.debug_cache_backend = False {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.333474] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.enable_retry_client = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.333631] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.enable_socket_keepalive = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.333796] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.enabled = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.333955] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.expiration_time = 600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.334129] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.hashclient_retry_attempts = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.334293] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.334452] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_dead_retry = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.334717] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_password = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.334783] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.334927] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.335095] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_pool_maxsize = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.335256] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.335410] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_sasl_enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.335584] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.335743] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.335909] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.memcache_username = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.336080] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.proxies = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.336243] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.retry_attempts = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.336402] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.retry_delay = 0.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.336822] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.socket_keepalive_count = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.336822] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.socket_keepalive_idle = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.336920] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.socket_keepalive_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337025] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.tls_allowed_ciphers = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337176] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.tls_cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337326] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.tls_certfile = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337485] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.tls_enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337637] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cache.tls_keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337802] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.337970] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.auth_type = password {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.338137] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.338308] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.338472] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.338631] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.338787] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.cross_az_attach = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.338944] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.debug = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.339112] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.endpoint_template = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.339271] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.http_retries = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.339429] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.339578] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.339744] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.os_region_name = RegionOne {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.339899] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.340067] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cinder.timeout = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.340240] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.340397] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.cpu_dedicated_set = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.340553] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.cpu_shared_set = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.340714] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.image_type_exclude_list = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.340881] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.341052] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.341215] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.341372] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.341540] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.341700] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.resource_provider_association_refresh = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.341858] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.shutdown_retry_interval = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.342043] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.342225] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] conductor.workers = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.342398] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] console.allowed_origins = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.342557] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] console.ssl_ciphers = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.342724] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] console.ssl_minimum_version = default {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.342890] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] consoleauth.token_ttl = 600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.343067] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.343223] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.343387] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.343543] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.343698] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.343851] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344023] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344170] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344329] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344485] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.region_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344792] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.service_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.344956] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.service_type = accelerator {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.345127] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.345283] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.345439] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.status_code_retry_delay = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.345590] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.345768] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.345923] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] cyborg.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.346114] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.backend = sqlalchemy {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.346295] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.connection = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.346462] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.connection_debug = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.346627] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.connection_parameters = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.346785] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.connection_recycle_time = 3600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.346945] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.connection_trace = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.347114] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.db_inc_retry_interval = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.347275] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.db_max_retries = 20 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.347432] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.db_max_retry_interval = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.347588] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.db_retry_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.347752] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.max_overflow = 50 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.347908] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.max_pool_size = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.348083] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.max_retries = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.348243] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
database.mysql_enable_ndb = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.348406] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.348563] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.mysql_wsrep_sync_wait = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.348718] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.pool_timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.348882] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.retry_interval = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.349047] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.slave_connection = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.349211] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.sqlite_synchronous = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.349367] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] database.use_db_reconnect = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.349543] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.backend = sqlalchemy {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.349715] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.connection = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.349880] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.connection_debug = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.350057] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.connection_parameters = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.350218] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.connection_recycle_time = 3600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.350380] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.connection_trace = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.350805] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.db_inc_retry_interval = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.350978] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.db_max_retries = 20 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.351159] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.db_max_retry_interval = 10 {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.351322] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.db_retry_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.351493] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.max_overflow = 50 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.351741] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.max_pool_size = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.351850] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.max_retries = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.351971] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.mysql_enable_ndb = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.352204] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.352300] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.352669] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.pool_timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
562.352853] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.retry_interval = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.353064] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.slave_connection = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.353239] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] api_database.sqlite_synchronous = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.353416] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] devices.enabled_mdev_types = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.353840] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.354032] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ephemeral_storage_encryption.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.354208] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.354385] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.api_servers = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.354553] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.354717] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.354880] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.355050] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.355215] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.355378] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.debug = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.355545] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.default_trusted_certificate_ids = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.355703] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.enable_certificate_validation = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.355863] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
glance.enable_rbd_download = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.356026] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.356195] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.356354] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.356510] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.356664] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.356873] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.num_retries = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.357057] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.rbd_ceph_conf = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.357224] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.rbd_connect_timeout = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.357393] 
env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.rbd_pool = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.357561] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.rbd_user = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.357719] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.region_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.357873] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.service_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.358048] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.service_type = image {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.358212] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.358368] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.358525] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.status_code_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.358685] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.timeout = None {{(pid=59534) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.358864] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.359035] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.verify_glance_signatures = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.359197] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] glance.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.359365] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] guestfs.debug = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.359534] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.config_drive_cdrom = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.359732] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.config_drive_inject_password = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.359912] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.360087] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.360251] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.enable_remotefx = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.360420] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.instances_path_share = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.360586] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.iscsi_initiator_list = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.360747] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.limit_cpu_features = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.360910] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.361085] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.361258] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.361422] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.361592] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.361756] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.use_multipath_io = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.361959] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.362088] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.362248] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.vswitch_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.362410] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.362580] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] mks.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.362919] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.363117] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] image_cache.manager_interval = 2400 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.363305] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] image_cache.precache_concurrency = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.363461] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] image_cache.remove_unused_base_images = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.363628] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.363790] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.363964] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] image_cache.subdirectory_name = _base {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.364152] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.api_max_retries = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.364344] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.api_retry_interval = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.364474] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.364633] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.auth_type = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.364788] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.364944] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.365123] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.365284] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.365440] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.365598] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.365757] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.365911] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.366078] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.366238] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.366395] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.partition_key = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.366558] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.peer_list = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.366716] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.region_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.366877] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.serial_console_state_timeout = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.367041] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.service_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.367216] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.service_type = baremetal {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.367378] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.367669] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.367711] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.status_code_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.367841] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.368065] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.368250] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ironic.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.368435] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.368611] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] key_manager.fixed_key = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.368791] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.368951] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.barbican_api_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.369122] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.barbican_endpoint = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.369295] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.barbican_endpoint_type = public {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.369454] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.barbican_region_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.369613] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.369771] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.369933] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.370105] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.370264] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.370429] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.number_of_retries = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.370598] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.retry_delay = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.370761] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.send_service_user_token = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.370922] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.371090] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.371299] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.verify_ssl = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.371466] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican.verify_ssl_path = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.371634] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.371796] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.auth_type = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.372015] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.372197] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.372383] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373109] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373109] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373109] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373109] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] barbican_service_user.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373289] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.approle_role_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373326] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.approle_secret_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373577] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373641] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.373822] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.374008] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.374184] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.374762] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.kv_mountpoint = secret {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.374762] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.kv_version = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.374762] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.namespace = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.374923] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.root_token_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.375039] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.375212] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.ssl_ca_crt_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.375382] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.375550] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.use_ssl = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.375720] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.375890] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.376062] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.376228] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.376388] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.376548] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.376702] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.376898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.377080] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.377239] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.377397] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.377558] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.region_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.377711] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.service_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.377877] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.service_type = identity {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.378048] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.378210] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.378369] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.status_code_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.378529] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.378708] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.378879] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] keystone.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.379086] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.connection_uri = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.379250] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_mode = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.379414] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.379583] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_models = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.379753] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_power_governor_high = performance {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.379921] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.380095] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_power_management = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.380271] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.380436] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.device_detach_attempts = 8 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.380595] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.device_detach_timeout = 20 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.380757] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.disk_cachemodes = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.380955] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.disk_prefix = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.381135] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.enabled_perf_events = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.381298] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.file_backed_memory = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.381461] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.gid_maps = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.381617] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.hw_disk_discard = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.381770] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.hw_machine_type = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.381938] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_rbd_ceph_conf = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.382111] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.382282] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.382452] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_rbd_glance_store_name = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.382619] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_rbd_pool = rbd {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.382786] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_type = default {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.382942] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.images_volume_group = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.383114] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.inject_key = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.383391] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.inject_partition = -2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.383430] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.inject_password = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.383584] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.iscsi_iface = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.383752] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.iser_use_multipath = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.383920] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_bandwidth = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.384093] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.384261] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_downtime = 500 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.384421] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.384578] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.384734] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_inbound_addr = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.384892] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.385122] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.385224] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_scheme = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.385404] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_timeout_action = abort {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.385572] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_tunnelled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.385701] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_uri = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.385863] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.live_migration_with_native_tls = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.386030] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.max_queues = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.386197] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.386353] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.nfs_mount_options = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.386662] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.386834] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.386998] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.387172] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.387335] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.387502] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.num_pcie_ports = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.387666] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.387828] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.pmem_namespaces = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.387982] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.quobyte_client_cfg = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.388290] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.388481] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.388625] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.388787] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.389110] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rbd_secret_uuid = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.389110] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rbd_user = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.389264] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.389435] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59534) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.389594] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rescue_image_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.389749] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rescue_kernel_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.389903] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rescue_ramdisk_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.390080] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.390239] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.rx_queue_size = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.390402] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.smbfs_mount_options = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.390676] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.390844] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.snapshot_compression = False {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.391014] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.snapshot_image_format = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.391237] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.391398] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.sparse_logical_volumes = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.391562] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.swtpm_enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.391732] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.swtpm_group = tss {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.391901] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.swtpm_user = tss {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.392080] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.sysinfo_serial = unique {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.392242] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.tx_queue_size = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.392408] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.uid_maps = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.392570] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.use_virtio_for_bridges = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.392741] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.virt_type = kvm {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.392906] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.volume_clear = zero {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.393083] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.volume_clear_size = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.393253] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.volume_use_multipath = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.393411] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_cache_path = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.393580] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.393744] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.393906] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.394096] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.394388] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.394562] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.vzstorage_mount_user = stack {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.394727] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.394899] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.395081] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.auth_type = password {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.395248] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.395403] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.395565] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.395721] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.395878] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.396056] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.default_floating_pool = public {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.396217] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.396378] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.extension_sync_interval = 600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.396541] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.http_retries = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.396703] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.396857] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.397047] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.397187] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.397344] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.397511] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.ovs_bridge = br-int {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.397672] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.physnets = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.397838] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.region_name = RegionOne {{(pid=59534) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398014] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.service_metadata_proxy = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398174] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.service_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398342] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.service_type = network {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398507] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398659] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398813] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.status_code_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.398967] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.399158] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.399316] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] neutron.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.399488] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] notifications.bdms_in_notifications = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.399663] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] notifications.default_level = INFO {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] notifications.notification_format = unversioned {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] notifications.notify_on_state_change = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] pci.alias = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] pci.device_spec = [] {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401639] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] pci.report_in_placement = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.auth_type = password {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.401898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402143] env[59534]: 
DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402143] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.default_domain_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402212] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.default_domain_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402359] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.domain_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402514] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.domain_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402672] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402834] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.402988] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.403157] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
placement.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.403317] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.403488] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.password = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.403646] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.project_domain_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.403813] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.project_domain_name = Default {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.403980] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.project_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.404166] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.project_name = service {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.404338] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.region_name = RegionOne {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.404500] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.service_name = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.404667] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.service_type = placement {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.404830] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.404989] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.405165] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.status_code_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.405334] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.system_scope = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.405495] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.405653] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.trust_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.405810] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.user_domain_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.405975] env[59534]: 
DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.user_domain_name = Default {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.406154] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.user_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.406336] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.username = placement {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.406519] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.406679] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] placement.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.406852] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.cores = 20 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.407025] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.count_usage_from_placement = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.407200] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.407371] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None 
None] quota.injected_file_content_bytes = 10240 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.407537] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.injected_file_path_length = 255 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.407700] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.injected_files = 5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.407865] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.instances = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.408040] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.key_pairs = 100 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.408207] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.metadata_items = 128 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.408371] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.ram = 51200 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.408535] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.recheck_quota = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.408701] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.server_group_members = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.408865] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] quota.server_groups = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.409046] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rdp.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.409363] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.409550] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.409716] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.409880] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.image_metadata_prefilter = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.410053] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.410222] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.max_attempts = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.410384] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.max_placement_results = 1000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.410548] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.410709] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.410865] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.411038] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.411215] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] scheduler.workers = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.411392] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.411563] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.411741] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.411911] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.412092] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.412262] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.412426] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.412612] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.412777] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.host_subset_size = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.412937] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.413110] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.413288] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.isolated_hosts = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.413443] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.isolated_images = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.413605] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.413761] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.413918] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.pci_in_placement = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.414090] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.414254] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.414411] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.414567] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.414724] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.414885] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.415059] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.track_instance_changes = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.415238] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.415409] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metrics.required = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.415574] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metrics.weight_multiplier = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.415731] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.415895] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] metrics.weight_setting = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.416224] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.416401] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] serial_console.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.416582] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] serial_console.port_range = 10000:20000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.416753] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.416921] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.417109] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] serial_console.serialproxy_port = 6083 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.417281] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.417455] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.auth_type = password {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.417615] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.417772] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.417932] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.418104] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.418263] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.418434] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.send_service_user_token = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.418597] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.418755] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] service_user.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.418922] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.agent_enabled = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.419109] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.419424] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.419619] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.419791] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.html5proxy_port = 6082 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.419952] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.image_compression = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.420122] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.jpeg_compression = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.420280] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.playback_compression = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.420450] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.server_listen = 127.0.0.1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.420617] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.420774] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.streaming_mode = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.420930] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] spice.zlib_compression = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.421111] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] upgrade_levels.baseapi = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.421273] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] upgrade_levels.cert = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.421445] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] upgrade_levels.compute = auto {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.421602] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] upgrade_levels.conductor = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.421767] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] upgrade_levels.scheduler = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.421936] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.422112] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.422273] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.422432] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.422594] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.422753] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.422910] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.423085] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.423246] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vendordata_dynamic_auth.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.423423] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.api_retry_count = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.423587] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.ca_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.423759] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.423926] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.cluster_name = testcl1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.424104] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.connection_pool_size = 10 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.424269] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.console_delay_seconds = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.424436] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.datastore_regex = ^datastore.* {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.424652] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.424823] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.host_password = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.424991] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.host_port = 443 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.425172] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.host_username = administrator@vsphere.local {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.425346] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.insecure = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.425509] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.integration_bridge = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.425672] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.maximum_objects = 100 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.425828] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.pbm_default_policy = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.425987] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.pbm_enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.426154] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.pbm_wsdl_location = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.426320] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.426481] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.serial_port_proxy_uri = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.426636] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.serial_port_service_uri = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.426802] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.task_poll_interval = 0.5 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.426970] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.use_linked_clone = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.427149] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.vnc_keymap = en-us {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.427314] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.vnc_port = 5900 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.427479] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vmware.vnc_port_total = 10000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.427662] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.auth_schemes = ['none'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.427837] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.428166] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.428355] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.428532] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.novncproxy_port = 6080 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.428714] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.server_listen = 127.0.0.1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.428892] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.429067] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.vencrypt_ca_certs = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.429232] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.vencrypt_client_cert = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.429391] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vnc.vencrypt_client_key = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.429576] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.429739] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.disable_deep_image_inspection = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.429900] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.430080] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.430244] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.430406] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.disable_rootwrap = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.430569] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.enable_numa_live_migration = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.430728] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.430888] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.431059] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.431224] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.libvirt_disable_apic = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.431380] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.431544] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.431700] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.431860] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432029] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432183] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432342] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432500] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432655] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432816] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.432996] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.433176] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.client_socket_timeout = 900 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.433345] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.default_pool_size = 1000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.433509] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.keep_alive = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.433672] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.max_header_line = 16384 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.433834] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.secure_proxy_ssl_header = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.433995] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.ssl_ca_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.434166] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.ssl_cert_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.434326] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.ssl_key_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.434491] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.tcp_keepidle = 600 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.434660] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.434822] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] zvm.ca_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.434981] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] zvm.cloud_connector_url = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.435297] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.435470] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] zvm.reachable_timeout = 300 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.435649] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.enforce_new_defaults = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.435817] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.enforce_scope = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.435988] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.policy_default_rule = default {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.436183] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.436356] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.policy_file = policy.yaml {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.436527] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.436685] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.436842] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.436999] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.437172] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.437339] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.437514] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.437689] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.connection_string = messaging:// {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.437855] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.enabled = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.438038] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.es_doc_type = notification {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.438205] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.es_scroll_size = 10000 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.438372] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.es_scroll_time = 2m {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.438535] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.filter_error_trace = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.438701] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.438865] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.sentinel_service_name = mymaster {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.439045] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] profiler.socket_timeout = 0.1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.439209] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
profiler.trace_sqlalchemy = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.439376] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] remote_debug.host = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.439537] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] remote_debug.port = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.439713] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.439875] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.440048] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.440214] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.440377] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.440539] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.440696] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.440854] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.441024] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.441185] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.441351] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.441517] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.441682] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.441847] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.442019] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.442195] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.442357] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.442520] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.442683] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.442844] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443014] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443250] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443342] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443500] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443668] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443828] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.ssl = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.443996] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.444175] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.444339] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59534) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.444504] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.444672] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.444861] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.445033] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_notifications.retry = -1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.445217] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.445390] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.445562] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.auth_section = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.445726] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.auth_type = None {{(pid=59534) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.445885] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.cafile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446052] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.certfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446215] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.collect_timing = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446370] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.connect_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446526] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.connect_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446680] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.endpoint_id = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446835] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.endpoint_override = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.446993] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.insecure = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.447160] env[59534]: DEBUG 
oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.keyfile = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.447314] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.max_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.447470] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.min_version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.447621] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.region_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.447772] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.service_name = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.447928] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.service_type = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.448096] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.split_loggers = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.448255] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.status_code_retries = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.448423] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
oslo_limit.status_code_retry_delay = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.448579] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.timeout = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.448733] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.valid_interfaces = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.448889] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_limit.version = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.449063] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_reports.file_event_handler = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.449228] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.449383] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] oslo_reports.log_dir = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.449553] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.449710] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.449864] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450035] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450200] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450354] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450524] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450679] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_ovs_privileged.group = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450835] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.450998] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.451173] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.451328] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] vif_plug_ovs_privileged.user = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.451495] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.451671] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.451839] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.452010] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.452184] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.452346] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.452512] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.452670] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.452842] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_ovs.isolate_vif = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.453013] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.453185] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.453376] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.453520] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.453679] env[59534]: DEBUG oslo_service.service [None 
req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_vif_ovs.per_port_bridge = False {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.453840] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] os_brick.lock_path = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454009] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] privsep_osbrick.capabilities = [21] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454171] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] privsep_osbrick.group = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454327] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] privsep_osbrick.helper_command = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454490] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454649] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454802] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] privsep_osbrick.user = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.454968] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] 
nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.455137] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] nova_sys_admin.group = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.455295] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] nova_sys_admin.helper_command = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.455457] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.455615] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.455769] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] nova_sys_admin.user = None {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.455898] env[59534]: DEBUG oslo_service.service [None req-8ba7e884-53fb-42db-92a9-f5f46accbbb1 None None] ******************************************************************************** {{(pid=59534) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 562.456314] env[59534]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 562.471244] env[59534]: INFO nova.virt.node [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Generated node identity 7c9b9790-f1a0-47dd-a54c-c74c172308d9 [ 562.471493] env[59534]: INFO nova.virt.node [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] 
Wrote node identity 7c9b9790-f1a0-47dd-a54c-c74c172308d9 to /opt/stack/data/n-cpu-1/compute_id [ 562.482962] env[59534]: WARNING nova.compute.manager [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Compute nodes ['7c9b9790-f1a0-47dd-a54c-c74c172308d9'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 562.515049] env[59534]: INFO nova.compute.manager [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 562.538551] env[59534]: WARNING nova.compute.manager [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 562.538783] env[59534]: DEBUG oslo_concurrency.lockutils [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.538984] env[59534]: DEBUG oslo_concurrency.lockutils [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.539144] env[59534]: DEBUG oslo_concurrency.lockutils [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.539346] env[59534]: DEBUG 
nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59534) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 562.540424] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f7b192c-891b-42ab-acff-f92994ea22ba {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.549165] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f04b588-702a-4c3a-b9cc-7679f49e6403 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.563326] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-642947ac-e78c-48a5-8be2-8d631cefdf57 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.569190] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b017582c-0141-4a9f-bb88-f239f302f1b8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.596985] env[59534]: DEBUG nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181509MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59534) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 562.597148] env[59534]: DEBUG oslo_concurrency.lockutils [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59534) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.597322] env[59534]: DEBUG oslo_concurrency.lockutils [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.608437] env[59534]: WARNING nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] No compute node record for cpu-1:7c9b9790-f1a0-47dd-a54c-c74c172308d9: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 7c9b9790-f1a0-47dd-a54c-c74c172308d9 could not be found. [ 562.620197] env[59534]: INFO nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 [ 562.667327] env[59534]: DEBUG nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 562.667490] env[59534]: DEBUG nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 562.762587] env[59534]: INFO nova.scheduler.client.report [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] [req-e7bf21ea-4632-4faa-8333-001caf4e72f9] Created resource provider record via placement API for resource provider with UUID 7c9b9790-f1a0-47dd-a54c-c74c172308d9 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 562.778437] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc770e94-947d-4bdb-985c-b19ffe7d5142 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.785800] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b17479d-da46-4489-a974-2d62d164826f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.814478] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-372be29c-321c-4e54-af58-927b21c1ba63 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.822072] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b7d7ec-2901-4277-aadc-74720ccfb453 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.836258] env[59534]: DEBUG nova.compute.provider_tree [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Updating inventory in ProviderTree for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 562.879180] env[59534]: DEBUG nova.scheduler.client.report [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Updated inventory for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 562.879416] env[59534]: DEBUG nova.compute.provider_tree [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Updating resource provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 generation from 0 to 1 during operation: update_inventory {{(pid=59534) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 562.879561] env[59534]: DEBUG nova.compute.provider_tree [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Updating inventory in ProviderTree for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 562.919229] env[59534]: DEBUG nova.compute.provider_tree [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Updating resource provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 generation from 1 to 2 during operation: update_traits {{(pid=59534) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 562.935433] env[59534]: DEBUG nova.compute.resource_tracker [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59534) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 562.935604] env[59534]: DEBUG oslo_concurrency.lockutils [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.338s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.935757] env[59534]: DEBUG nova.service [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Creating RPC server for service compute {{(pid=59534) start /opt/stack/nova/nova/service.py:182}} [ 562.949307] env[59534]: DEBUG nova.service [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] Join ServiceGroup membership for this service compute {{(pid=59534) start /opt/stack/nova/nova/service.py:199}} [ 562.949487] env[59534]: DEBUG nova.servicegroup.drivers.db [None req-3c7011bb-afae-43db-9e03-3d8cdd802cd4 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59534) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 580.955452] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 580.966069] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Getting list of instances from cluster (obj){ [ 580.966069] env[59534]: value = "domain-c8" [ 580.966069] env[59534]: _type = "ClusterComputeResource" [ 580.966069] env[59534]: } {{(pid=59534) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 580.967248] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6fc2b7f-de26-472c-a3e6-74d39d3c772a {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 580.976153] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Got total of 0 instances {{(pid=59534) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 580.976366] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 580.976657] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Getting list of instances from cluster (obj){ [ 580.976657] env[59534]: value = "domain-c8" [ 580.976657] env[59534]: _type = "ClusterComputeResource" [ 580.976657] env[59534]: } {{(pid=59534) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 580.977468] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2debf46-c7dc-428c-9d4c-14da8ea1db3b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 580.984584] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Got total of 0 instances {{(pid=59534) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 607.941254] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquiring lock "8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.941548] env[59534]: DEBUG oslo_concurrency.lockutils [None 
req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Lock "8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.961226] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 608.077618] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.077618] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.079244] env[59534]: INFO nova.compute.claims [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 608.152935] env[59534]: DEBUG oslo_concurrency.lockutils [None 
req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquiring lock "f42f768c-09ad-44b6-b294-681f392bb483" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.153208] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Lock "f42f768c-09ad-44b6-b294-681f392bb483" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.173598] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 608.249097] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.281453] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc34fa21-b7a6-45f0-8b57-624a9216d14e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.293816] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ec479e9-2fcc-44eb-86d0-9a14e8bd0d6e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.335213] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4fb3518-40fb-4331-8ed1-0e0f27e536fd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.344320] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b90a5e98-98fa-43cb-9403-feea7c7ab6c8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.359356] env[59534]: DEBUG nova.compute.provider_tree [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 608.375972] env[59534]: DEBUG nova.scheduler.client.report 
[None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 608.394116] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.397653] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 608.401235] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.152s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.404989] env[59534]: INFO nova.compute.claims [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 608.447326] env[59534]: DEBUG nova.compute.utils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 608.451178] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 608.451432] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 608.469759] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 608.536709] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-009e8338-122d-46c0-b35b-951cc1a2e36c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.548265] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd8d904c-8a38-42de-ac70-095b3a818b4a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.592592] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 608.595392] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb65f811-ac92-457b-9c91-b169fabfbef0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.604825] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84c07aa2-3a22-4f2c-89d8-aa43b5525bba {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.623216] env[59534]: DEBUG nova.compute.provider_tree [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 608.634685] env[59534]: DEBUG nova.scheduler.client.report [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 608.654012] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.654679] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 608.677281] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 608.677507] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:348}} [ 608.677996] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 608.678217] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 608.678360] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 608.678512] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 608.678771] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 608.678944] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Build topologies for 
1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 608.679311] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 608.679475] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 608.679639] env[59534]: DEBUG nova.virt.hardware [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 608.680622] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e782f46-5c55-4fe0-a686-6327843be5ec {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.688841] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d9d3ffb-e27a-44d8-bd9f-198048376752 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.695662] env[59534]: DEBUG nova.compute.utils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 608.697315] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 608.697538] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 608.711278] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cc38730-0d34-4bc6-acd3-9ee3883b14bb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.724102] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 608.789919] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 608.815185] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 608.815328] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 608.815414] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 608.815799] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 608.815799] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 608.815872] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 608.816116] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 608.816299] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 608.816475] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 608.816632] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 608.816802] env[59534]: DEBUG nova.virt.hardware [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 608.817676] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80609eec-de72-463d-ac8a-96c3756cc416 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.826994] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b5c1f0-7c45-45bc-a7d1-e63c700bfc44 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.156398] env[59534]: DEBUG nova.policy [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3bfa18770e114e418c7f762f5a50586d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26daced2efba4563aad36c2b98b1cc85', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 609.209610] env[59534]: DEBUG nova.policy [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c61839002ee04097901f9a4592f40730', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fceda5af1ddf4ce7ba350329af14baf7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 609.328559] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquiring lock "c4837d87-be47-4a47-9703-f73eeddb0053" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.329132] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Lock "c4837d87-be47-4a47-9703-f73eeddb0053" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.344473] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 609.409371] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.409672] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.411095] env[59534]: INFO nova.compute.claims [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 609.563747] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-462fa712-5cea-43ee-a8e2-a774988b3c50 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.575146] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35774b07-bd4f-481d-a365-0e814371c62b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.606868] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a7c97e5-68c2-451d-b97b-9a3aeecf021f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.613500] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74d39193-7c8d-4a39-8daf-c116d85ec16d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.638203] env[59534]: DEBUG nova.compute.provider_tree [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 609.654566] env[59534]: DEBUG nova.scheduler.client.report [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 609.669783] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.670346] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 609.710579] env[59534]: DEBUG nova.compute.utils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 609.712141] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 609.712366] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 609.722431] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 609.817463] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquiring lock "0f052eb8-59d7-4dcb-9d2f-bc7740424a66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.817800] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Lock "0f052eb8-59d7-4dcb-9d2f-bc7740424a66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.819623] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 609.834034] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 609.855894] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 609.856392] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 609.856688] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 609.857398] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 609.857398] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 609.857398] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 609.857586] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 609.857786] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 609.858093] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 609.858223] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 609.858422] env[59534]: DEBUG nova.virt.hardware [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 609.860260] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2022dce-ad20-422f-a9f8-8515d721f702 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.869329] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89cb90ad-39df-4372-9edd-7d7c216d36c2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.909466] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.909706] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.911605] env[59534]: INFO nova.compute.claims [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 610.038756] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09015352-8e99-409f-a50b-b27517d24789 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.046838] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85d4a5b9-42f1-4a6f-8dfa-5d26c9a34edf {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.078816] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-608a4a31-d36d-484d-a5cd-53eab237a180 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.086874] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b69561f1-c237-41ac-ad7b-073b220eb8e0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.101303] env[59534]: DEBUG nova.compute.provider_tree [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 610.110928] env[59534]: DEBUG nova.scheduler.client.report [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 610.129167] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 610.130105] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 610.174383] env[59534]: DEBUG nova.compute.utils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 610.176009] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 610.176766] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 610.186381] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 610.263729] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 610.292313] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 610.292313] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 610.292313] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 610.292719] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 610.292719] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 610.292719] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 610.292956] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 610.296750] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 610.297397] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 610.297658] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 610.297937] env[59534]: DEBUG nova.virt.hardware [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 610.298917] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43b49a31-7b99-4b75-b129-ac7076adc9fb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.309434] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1165c3bd-69d5-42c0-ace1-132260aff280 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.418720] env[59534]: DEBUG nova.policy [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7462a926c5694f6fa44d21bbb28c35e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '50174144e5d848ab94ee41b0e9cdc2d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 610.712828] env[59534]: DEBUG nova.policy [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '362a6736537c41b7988be4efea3d928f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f35535d2e68e42dd87b1629dc319e460', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 611.840902] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Successfully created port: 06045837-635b-4a38-bd57-9b1d16b054c1 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 611.860098] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Successfully created port: 01c27392-b745-429b-b2c0-4d8df0c419a8 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 613.535042] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquiring lock "734b416a-659c-451b-82ea-0b8a2796fffd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 613.535339] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Lock "734b416a-659c-451b-82ea-0b8a2796fffd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 613.550641] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Starting instance...
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 613.613239] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.613477] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.620258] env[59534]: INFO nova.compute.claims [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 613.787137] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-120038ed-188c-414e-965a-9cc1bec8fb44 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.799437] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a58be422-16fd-4e4c-9f05-339013b99353 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.834185] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-989254ea-65ce-40f1-acd8-64dd1462a3a9 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.843713] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8ab2679-c0a1-41af-88a6-bd6651b115a4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.859139] env[59534]: DEBUG nova.compute.provider_tree [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 613.868150] env[59534]: DEBUG nova.scheduler.client.report [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 613.888207] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.888657] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d 
tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 613.937560] env[59534]: DEBUG nova.compute.utils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 613.938836] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 613.939021] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 613.955894] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 614.036883] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Successfully created port: 00059401-5bcb-4b4e-a1d4-788f34da675b {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 614.062671] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 614.088809] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 614.089123] env[59534]: DEBUG 
nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 614.089321] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 614.089508] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 614.089648] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 614.090297] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 614.090297] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 614.090583] env[59534]: DEBUG 
nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 614.090665] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 614.090801] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 614.091320] env[59534]: DEBUG nova.virt.hardware [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 614.091872] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b86d1b02-fd3a-466f-a681-e6e0a853852d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.101489] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62498af3-0a35-44f0-83d3-066619a6deb6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.249705] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 
tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Successfully created port: b6831d1f-22a2-4cdf-84ad-ea32b3046dce {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 614.528409] env[59534]: DEBUG nova.policy [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b86ceed91444f8d99b5ca8f859c837f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c151ec284e644559b18bf267868e96ef', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 616.503587] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquiring lock "4a6f2391-5c17-452d-baf2-c4c62a2dde72" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.503587] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Lock "4a6f2391-5c17-452d-baf2-c4c62a2dde72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.518086] env[59534]: DEBUG nova.compute.manager [None 
req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 616.596582] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.596878] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.598361] env[59534]: INFO nova.compute.claims [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 616.735991] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Successfully created port: 01c3c143-df9f-46cc-89b1-8070df187a5d {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 616.804108] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-29a8c132-df15-40a1-a577-be748ff7ed58 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.812826] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d767d8a1-b31c-421d-9593-aa7b36e99d30 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.852895] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8ef9f8f-b777-46ca-86b2-b9611c672a04 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.866809] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09b89551-383e-4ecd-8a2d-a3a010bec7d6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.882695] env[59534]: DEBUG nova.compute.provider_tree [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 616.893224] env[59534]: DEBUG nova.scheduler.client.report [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 616.912902] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 616.913567] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 616.966891] env[59534]: DEBUG nova.compute.utils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 616.968567] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 616.968727] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 616.978367] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 617.071481] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 617.096986] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 617.097106] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 617.097187] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 617.097362] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Flavor 
pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 617.097503] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 617.097647] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 617.097841] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 617.097976] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 617.098592] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 617.098592] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 
tempest-ServerExternalEventsTest-1961432341-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 617.098701] env[59534]: DEBUG nova.virt.hardware [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 617.099598] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbaa4856-a72f-43f6-a06f-3218dfb6afc9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.112512] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b4d3f05-fbf8-41df-b4ac-6922331aadaa {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.531211] env[59534]: DEBUG nova.policy [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ea5eb457d974a029c0a27aead90c18a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '138039fed4eb4e17a4d7c396fc58e157', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 618.501799] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 
tempest-ServerDiagnosticsTest-268993674-project-member] Acquiring lock "533ba77f-7191-4af2-b3c7-204efa001ffa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.502066] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Lock "533ba77f-7191-4af2-b3c7-204efa001ffa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.514279] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 618.573139] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.573391] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.577344] env[59534]: INFO nova.compute.claims [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 618.695835] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.696178] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.696424] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Starting heal instance info cache {{(pid=59534) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9814}} [ 618.696584] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Rebuilding the list of instances to heal {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 618.715522] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.715639] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.715749] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.715867] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.715985] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Skipping network cache update for instance because it is Building. 
{{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.716658] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.716658] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.716759] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Didn't find any instances for network info cache update. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 618.717913] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.718120] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.719046] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.719046] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.719046] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.719046] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.719046] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59534) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 618.719267] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.734683] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.773385] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a768aee-1044-4444-996d-41a5a27d6f62 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.783975] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b88f929-cc59-4fa2-9108-fca94ed9725f 
{{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.818604] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a58e4def-0d6d-49e0-a416-94cb609b4750 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.829307] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50d5fd2-2e3b-46be-95c6-30725a146e49 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.844289] env[59534]: DEBUG nova.compute.provider_tree [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 618.860802] env[59534]: DEBUG nova.scheduler.client.report [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 618.882691] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.883371] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 618.885882] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.151s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.885960] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.888488] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59534) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 618.889578] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b727878-592b-447d-8fe9-e55d929e60e2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.902440] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32dd0eb5-18ff-494c-95c0-83df857ceaac {{(pid=59534) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.916740] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80b5652c-a57d-48b4-9023-9fa4ee41f819 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.924395] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a3ada6f-1c94-4745-b14e-80502ef6e8c3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.929056] env[59534]: DEBUG nova.compute.utils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 618.930533] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 618.931483] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 618.966935] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181500MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59534) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 618.967112] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.967303] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.969522] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 619.050500] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 619.079639] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.080024] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.080101] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 
tempest-ServerDiagnosticsTest-268993674-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.080354] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.080805] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 619.080805] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.081090] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.081472] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 619.081933] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 
tempest-ServerDiagnosticsTest-268993674-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.082126] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.082315] env[59534]: DEBUG nova.virt.hardware [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.083442] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f8c9596-d810-41a3-9fdd-783f6052f867 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.089973] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090146] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance f42f768c-09ad-44b6-b294-681f392bb483 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090274] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance c4837d87-be47-4a47-9703-f73eeddb0053 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090396] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 0f052eb8-59d7-4dcb-9d2f-bc7740424a66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090515] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 734b416a-659c-451b-82ea-0b8a2796fffd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090631] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 4a6f2391-5c17-452d-baf2-c4c62a2dde72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090895] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 533ba77f-7191-4af2-b3c7-204efa001ffa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.090969] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 619.091073] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 619.102554] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8068ef36-9a97-4175-9daa-dbc0cc7845a3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.398660] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-058f1968-a05f-4ef7-8f3f-72a427fa4c1c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.413872] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b4abb8-2ad5-489d-a78d-927abf38a794 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.445374] env[59534]: DEBUG 
nova.policy [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '13832e8d1fac4a96a5aacca564f86ee0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cc065469898443b8ab77b8a7b07f39b2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 619.447407] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73ae5e34-c520-4204-870e-a3c85ccd48f2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.456343] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da9e80d7-057b-410b-bdac-36f711ccafda {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.468871] env[59534]: DEBUG nova.compute.provider_tree [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.482804] env[59534]: DEBUG nova.scheduler.client.report [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.500585] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59534) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 619.500772] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.533s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.398214] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Successfully created port: 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 620.657697] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquiring lock "e105adff-476c-46b0-b795-daf44b69ef3a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.657933] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Lock "e105adff-476c-46b0-b795-daf44b69ef3a" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.678437] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 620.777848] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.778089] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.780127] env[59534]: INFO nova.compute.claims [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 620.950226] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c27a29a-c70c-4b22-a164-cc599207eb7e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.958128] env[59534]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88a3b896-7da9-409b-b4fe-f27d47c6796f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.989807] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-698deb33-3af0-4c49-9f30-a4ae72137071 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.997706] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5f7fdf0-0336-43da-9072-2e32a5184163 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.013205] env[59534]: DEBUG nova.compute.provider_tree [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.024346] env[59534]: DEBUG nova.scheduler.client.report [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.039664] env[59534]: DEBUG oslo_concurrency.lockutils [None 
req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.040169] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 621.087056] env[59534]: DEBUG nova.compute.utils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 621.089143] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 621.089143] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 621.099905] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 621.197364] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 621.230220] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 621.234612] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 621.234612] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 621.234612] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Flavor pref 0:0:0 
{{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 621.234612] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 621.234612] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 621.234907] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 621.234986] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 621.235352] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 621.235352] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 
tempest-ServerActionsTestOtherB-998771733-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 621.235482] env[59534]: DEBUG nova.virt.hardware [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 621.236870] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adc1e137-0e02-44d4-86e1-ddbce4840d93 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.247461] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d7e2ee8-7768-40e5-8322-7527cc03809f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.589575] env[59534]: DEBUG nova.policy [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0dbe8eae212437cacb8874747d35ef4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69230e047a304d339209621a4da6d72e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 622.026320] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 
tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Successfully created port: 88088e2d-f713-4fc1-93a7-61174147040f {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 622.235676] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquiring lock "62cecc37-ce9f-42f6-8be2-efa724e94916" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.235907] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Lock "62cecc37-ce9f-42f6-8be2-efa724e94916" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.251749] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 622.300329] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.300875] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.302043] env[59534]: INFO nova.compute.claims [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 622.482195] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d91eecd-7c95-47ad-8c5f-21d9c994de9b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.490179] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef5fe137-5f38-4983-98cc-760600ed1db3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.521565] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3782e2ec-20de-4d25-b3c5-35e8c253ed56 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.529175] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb44d6a0-9a20-421e-8b79-c03e48d27def {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.542537] env[59534]: DEBUG nova.compute.provider_tree [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.554468] env[59534]: DEBUG nova.scheduler.client.report [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.573376] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.573890] env[59534]: DEBUG nova.compute.manager [None 
req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 622.611868] env[59534]: DEBUG nova.compute.utils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 622.617033] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 622.617033] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 622.623855] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 622.696077] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 622.722202] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 622.722443] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 622.722594] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df 
tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 622.722819] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 622.722970] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 622.723126] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 622.723326] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 622.723477] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 622.723667] 
env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 622.723826] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 622.723996] env[59534]: DEBUG nova.virt.hardware [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 622.724859] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac5151f-5384-461c-ae8a-1ed4ec9fbe43 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.733285] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05283933-7c25-4315-a1af-6e6246a8d7fe {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.027945] env[59534]: DEBUG nova.policy [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be577712952b47759b21d43fc98cba33', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'57aecfe513b64d13971574d0920a47e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.877245] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquiring lock "89c6b8db-b87a-4a05-9fab-72eff91e4fe3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.878455] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Lock "89c6b8db-b87a-4a05-9fab-72eff91e4fe3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.890523] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 623.958948] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.959197] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.962046] env[59534]: INFO nova.compute.claims [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 624.202025] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1df62308-6419-4fce-93b2-b5712be8d1d2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.209092] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62a1c2f0-75f0-401d-95f6-6e0911c14f5c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.243828] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c938c85-9195-42f3-b3c5-15e6841696e9 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.251640] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71dd2bb2-043e-475b-bb1e-17cc811dd60f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.265261] env[59534]: DEBUG nova.compute.provider_tree [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 624.274728] env[59534]: DEBUG nova.scheduler.client.report [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 624.295253] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.295253] env[59534]: DEBUG nova.compute.manager [None 
req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 624.344122] env[59534]: DEBUG nova.compute.utils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 624.347713] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 624.347713] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 624.355760] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 624.434250] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 624.467423] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 624.467711] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 624.467857] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd 
tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 624.468119] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 624.468274] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 624.468412] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 624.468613] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 624.468757] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 624.472021] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 624.472021] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 624.472021] env[59534]: DEBUG nova.virt.hardware [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 624.472021] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80a1d7c4-0d39-4ff0-862c-fcd8cffd5f6e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 624.481230] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68d31b1f-9f76-4461-b4a1-2cb1f5facb12 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 624.823741] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Successfully created port: bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 624.903758] env[59534]: DEBUG nova.policy [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e8d8c5a5c6e43339c62b1ea77a6e646', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88bd5475f9aa4c0db831466e666607ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 625.329317] env[59534]: ERROR nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information.
[ 625.329317] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.329317] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.329317] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.329317] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.329317] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.329317] env[59534]: ERROR nova.compute.manager raise self.value
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.329317] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 625.329317] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.329317] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 625.330637] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.330637] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 625.330637] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information.
[ 625.330637] env[59534]: ERROR nova.compute.manager
[ 625.330637] env[59534]: Traceback (most recent call last):
[ 625.330637] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 625.330637] env[59534]: listener.cb(fileno)
[ 625.330637] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 625.330637] env[59534]: result = function(*args, **kwargs)
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 625.330637] env[59534]: return func(*args, **kwargs)
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 625.330637] env[59534]: raise e
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.330637] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.330637] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.330637] env[59534]: with excutils.save_and_reraise_exception():
[ 625.330637] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.330637] env[59534]: self.force_reraise()
[ 625.330637] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.330637] env[59534]: raise self.value
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.330637] env[59534]: updated_port = self._update_port(
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.330637] env[59534]: _ensure_no_port_binding_failure(port)
[ 625.330637] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.330637] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 625.331439] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information.
[ 625.331439] env[59534]: Removing descriptor: 12
[ 625.331439] env[59534]: ERROR nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information.
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] Traceback (most recent call last):
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] yield resources
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self.driver.spawn(context, instance, image_meta,
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 625.331439] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] vm_ref = self.build_virtual_machine(instance,
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] vif_infos = vmwarevif.get_vif_info(self._session,
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] for vif in network_info:
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return self._sync_wrapper(fn, *args, **kwargs)
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self.wait()
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self[:] = self._gt.wait()
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return self._exit_event.wait()
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 625.331742] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] result = hub.switch()
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return self.greenlet.switch()
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] result = function(*args, **kwargs)
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return func(*args, **kwargs)
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] raise e
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] nwinfo = self.network_api.allocate_for_instance(
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] created_port_ids = self._update_ports_for_instance(
[ 625.332115] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] with excutils.save_and_reraise_exception():
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self.force_reraise()
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] raise self.value
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] updated_port = self._update_port(
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] _ensure_no_port_binding_failure(port)
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] raise exception.PortBindingFailed(port_id=port['id'])
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information.
[ 625.332431] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483]
[ 625.336300] env[59534]: INFO nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Terminating instance
[ 625.338723] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquiring lock "refresh_cache-f42f768c-09ad-44b6-b294-681f392bb483" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 625.338723] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquired lock "refresh_cache-f42f768c-09ad-44b6-b294-681f392bb483" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 625.338723] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 625.344871] env[59534]: ERROR nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information.
[ 625.344871] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.344871] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.344871] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.344871] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.344871] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.344871] env[59534]: ERROR nova.compute.manager raise self.value
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.344871] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 625.344871] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.344871] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 625.345306] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.345306] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 625.345306] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information.
[ 625.345306] env[59534]: ERROR nova.compute.manager
[ 625.345306] env[59534]: Traceback (most recent call last):
[ 625.345306] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 625.345306] env[59534]: listener.cb(fileno)
[ 625.345306] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 625.345306] env[59534]: result = function(*args, **kwargs)
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 625.345306] env[59534]: return func(*args, **kwargs)
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 625.345306] env[59534]: raise e
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.345306] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.345306] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.345306] env[59534]: with excutils.save_and_reraise_exception():
[ 625.345306] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.345306] env[59534]: self.force_reraise()
[ 625.345306] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.345306] env[59534]: raise self.value
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.345306] env[59534]: updated_port = self._update_port(
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.345306] env[59534]: _ensure_no_port_binding_failure(port)
[ 625.345306] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.345306] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 625.345996] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information.
[ 625.345996] env[59534]: Removing descriptor: 13
[ 625.345996] env[59534]: ERROR nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information.
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Traceback (most recent call last):
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] yield resources
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self.driver.spawn(context, instance, image_meta,
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 625.345996] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] vm_ref = self.build_virtual_machine(instance,
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] vif_infos = vmwarevif.get_vif_info(self._session,
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] for vif in network_info:
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return self._sync_wrapper(fn, *args, **kwargs)
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self.wait()
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self[:] = self._gt.wait()
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return self._exit_event.wait()
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 625.346336] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] result = hub.switch()
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return self.greenlet.switch()
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] result = function(*args, **kwargs)
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return func(*args, **kwargs)
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] raise e
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] nwinfo = self.network_api.allocate_for_instance(
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] created_port_ids = self._update_ports_for_instance(
[ 625.346700] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] with excutils.save_and_reraise_exception():
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self.force_reraise()
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] raise self.value
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] updated_port = self._update_port(
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] _ensure_no_port_binding_failure(port)
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] raise exception.PortBindingFailed(port_id=port['id'])
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information.
[ 625.347052] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2]
[ 625.347395] env[59534]: INFO nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Terminating instance
[ 625.350330] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquiring lock "refresh_cache-8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 625.350417] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquired lock "refresh_cache-8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 625.350557] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 625.632621] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 625.638838] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 626.127263] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 626.139341] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Releasing lock "refresh_cache-f42f768c-09ad-44b6-b294-681f392bb483" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 626.139728] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 626.139904] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 626.140420] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f60fb63c-f488-44ee-98d0-d2d5e4d36905 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 626.150988] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3059a3c-0861-4da9-9445-ff73c19419c5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 626.163838] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 626.165722] env[59534]: ERROR nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information.
[ 626.165722] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 626.165722] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.165722] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 626.165722] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.165722] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 626.165722] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.165722] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 626.165722] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.165722] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 626.165722] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.165722] env[59534]: ERROR nova.compute.manager raise self.value [ 626.165722] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.165722] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 626.165722] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.165722] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 626.166234] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.166234] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 626.166234] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. [ 626.166234] env[59534]: ERROR nova.compute.manager [ 626.166234] env[59534]: Traceback (most recent call last): [ 626.166234] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 626.166234] env[59534]: listener.cb(fileno) [ 626.166234] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 626.166234] env[59534]: result = function(*args, **kwargs) [ 626.166234] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 626.166234] env[59534]: return func(*args, **kwargs) [ 626.166234] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 626.166234] env[59534]: raise e [ 626.166234] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.166234] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 626.166234] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.166234] env[59534]: created_port_ids = self._update_ports_for_instance( [ 626.166234] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.166234] env[59534]: with excutils.save_and_reraise_exception(): [ 626.166234] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.166234] env[59534]: self.force_reraise() [ 626.166234] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.166234] env[59534]: raise self.value [ 626.166234] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.166234] env[59534]: updated_port = self._update_port( [ 626.166234] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.166234] env[59534]: _ensure_no_port_binding_failure(port) [ 626.166234] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.166234] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 626.167097] env[59534]: nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. [ 626.167097] env[59534]: Removing descriptor: 14 [ 626.167167] env[59534]: ERROR nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Traceback (most recent call last): [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] yield resources [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self.driver.spawn(context, instance, image_meta, [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: 
c4837d87-be47-4a47-9703-f73eeddb0053] self._vmops.spawn(context, instance, image_meta, injected_files, [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] vm_ref = self.build_virtual_machine(instance, [ 626.167167] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] vif_infos = vmwarevif.get_vif_info(self._session, [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] for vif in network_info: [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return self._sync_wrapper(fn, *args, **kwargs) [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self.wait() [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self[:] = self._gt.wait() [ 626.167512] 
env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return self._exit_event.wait() [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] result = hub.switch() [ 626.167512] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return self.greenlet.switch() [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] result = function(*args, **kwargs) [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return func(*args, **kwargs) [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] raise e [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] nwinfo = self.network_api.allocate_for_instance( [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] created_port_ids = self._update_ports_for_instance( [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.167927] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] with excutils.save_and_reraise_exception(): [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self.force_reraise() [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] raise self.value [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] updated_port = self._update_port( [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] _ensure_no_port_binding_failure(port) [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] raise exception.PortBindingFailed(port_id=port['id']) [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. [ 626.168331] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] [ 626.168703] env[59534]: INFO nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Terminating instance [ 626.170957] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquiring lock "refresh_cache-c4837d87-be47-4a47-9703-f73eeddb0053" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.171160] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquired lock "refresh_cache-c4837d87-be47-4a47-9703-f73eeddb0053" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.171290] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d 
tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 626.176772] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Releasing lock "refresh_cache-8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.177138] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 626.177325] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 626.181801] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4aea45e3-2ec1-4755-9289-967cab2fec7c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.187328] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 
f42f768c-09ad-44b6-b294-681f392bb483 could not be found. [ 626.187328] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 626.187328] env[59534]: INFO nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Took 0.04 seconds to destroy the instance on the hypervisor. [ 626.187328] env[59534]: DEBUG oslo.service.loopingcall [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 626.187328] env[59534]: DEBUG nova.compute.manager [-] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 626.187783] env[59534]: DEBUG nova.network.neutron [-] [instance: f42f768c-09ad-44b6-b294-681f392bb483] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 626.196696] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13c7ab82-33b0-40d1-8f7e-699a87a3194e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.219730] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2 could not be found. [ 626.219887] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 626.220075] env[59534]: INFO nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 626.220312] env[59534]: DEBUG oslo.service.loopingcall [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 626.220507] env[59534]: DEBUG nova.compute.manager [-] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 626.224189] env[59534]: DEBUG nova.network.neutron [-] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 626.298913] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.301274] env[59534]: DEBUG nova.network.neutron [-] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.312178] env[59534]: DEBUG nova.network.neutron [-] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.322742] env[59534]: INFO nova.compute.manager [-] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Took 0.14 seconds to deallocate network for instance. 
[ 626.327792] env[59534]: DEBUG nova.compute.claims [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 626.327894] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.328113] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.339902] env[59534]: DEBUG nova.network.neutron [-] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.354992] env[59534]: DEBUG nova.network.neutron [-] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.376844] env[59534]: INFO nova.compute.manager [-] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Took 0.16 seconds to deallocate network for instance. 
[ 626.379973] env[59534]: DEBUG nova.compute.claims [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 626.380308] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.493377] env[59534]: ERROR nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. 
[ 626.493377] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 626.493377] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.493377] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 626.493377] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.493377] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 626.493377] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.493377] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 626.493377] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.493377] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 626.493377] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.493377] env[59534]: ERROR nova.compute.manager raise self.value [ 626.493377] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.493377] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 626.493377] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.493377] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 626.493817] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.493817] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 626.493817] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. [ 626.493817] env[59534]: ERROR nova.compute.manager [ 626.493817] env[59534]: Traceback (most recent call last): [ 626.493817] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 626.493817] env[59534]: listener.cb(fileno) [ 626.493817] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 626.493817] env[59534]: result = function(*args, **kwargs) [ 626.493817] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 626.493817] env[59534]: return func(*args, **kwargs) [ 626.493817] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 626.493817] env[59534]: raise e [ 626.493817] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.493817] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 626.493817] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.493817] env[59534]: created_port_ids = self._update_ports_for_instance( [ 626.493817] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.493817] env[59534]: with excutils.save_and_reraise_exception(): [ 626.493817] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.493817] env[59534]: self.force_reraise() [ 626.493817] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.493817] env[59534]: raise self.value [ 626.493817] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.493817] env[59534]: updated_port = self._update_port( [ 626.493817] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.493817] env[59534]: _ensure_no_port_binding_failure(port) [ 626.493817] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.493817] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 626.494519] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. [ 626.494519] env[59534]: Removing descriptor: 15 [ 626.494519] env[59534]: ERROR nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. 
[ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Traceback (most recent call last): [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] yield resources [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] self.driver.spawn(context, instance, image_meta, [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 626.494519] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] vm_ref = self.build_virtual_machine(instance, [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] vif_infos = vmwarevif.get_vif_info(self._session, [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 626.494833] env[59534]: ERROR 
nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] for vif in network_info: [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] return self._sync_wrapper(fn, *args, **kwargs) [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] self.wait() [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] self[:] = self._gt.wait() [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] return self._exit_event.wait() [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 626.494833] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] result = hub.switch() [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] return self.greenlet.switch() [ 626.495233] env[59534]: ERROR 
nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] result = function(*args, **kwargs) [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] return func(*args, **kwargs) [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] raise e [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] nwinfo = self.network_api.allocate_for_instance( [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] created_port_ids = self._update_ports_for_instance( [ 626.495233] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] with excutils.save_and_reraise_exception(): [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 
0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] self.force_reraise() [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] raise self.value [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] updated_port = self._update_port( [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] _ensure_no_port_binding_failure(port) [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] raise exception.PortBindingFailed(port_id=port['id']) [ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. 
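The traceback above terminates in `_ensure_no_port_binding_failure` (nova/network/neutron.py), which converts a failed Neutron binding into `nova.exception.PortBindingFailed`. A minimal illustrative sketch of that check follows — the class and function here are simplified stand-ins, not the actual nova source:

```python
# Illustrative sketch (not the actual nova source) of the check that
# raises PortBindingFailed in the traceback above. Neutron marks an
# unbindable port by setting its binding:vif_type to 'binding_failed';
# nova turns that into an exception so the build can be aborted and,
# as seen later in this log, re-scheduled.
VIF_TYPE_BINDING_FAILED = 'binding_failed'

class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")
        self.port_id = port_id

def ensure_no_port_binding_failure(port):
    # 'port' is a dict shaped like a Neutron API port response.
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port['id'])
```

The practical takeaway matches the log message: the root cause is on the Neutron side (the ML2 mechanism drivers could not bind the port on the target host), so the neutron server and agent logs are where to look next.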
[ 626.495573] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] [ 626.495915] env[59534]: INFO nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Terminating instance [ 626.504129] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquiring lock "refresh_cache-0f052eb8-59d7-4dcb-9d2f-bc7740424a66" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.504294] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquired lock "refresh_cache-0f052eb8-59d7-4dcb-9d2f-bc7740424a66" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.504455] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 626.546726] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-873dbcbf-114f-44ab-8992-c8f2676afafb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.557391] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 
tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquiring lock "58622c1f-054c-454b-a288-f544fe883157" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.557626] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Lock "58622c1f-054c-454b-a288-f544fe883157" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.558659] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7b7d965-a398-4dd1-a795-455244108e4a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.594125] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b183daf1-8cad-4efe-9ee4-e2d0a4891ded {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.603400] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c7c7920-4e24-4e6e-b841-aa96315c2ea2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.620202] env[59534]: DEBUG nova.compute.provider_tree [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 626.629659] env[59534]: DEBUG nova.scheduler.client.report [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 626.648856] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.649505] env[59534]: ERROR nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information. 
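The inventory record logged above determines how much the scheduler can place on this provider. Placement computes allocatable capacity per resource class as `(total - reserved) * allocation_ratio`; the values below are copied from the logged inventory for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9:

```python
# Quick arithmetic check of the logged inventory: allocatable capacity
# per resource class is (total - reserved) * allocation_ratio.
def capacity(inv):
    return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
caps = {rc: capacity(inv) for rc, inv in inventory.items()}
# e.g. VCPU: (48 - 0) * 4.0 = 192 allocatable vCPUs on this provider
```

Note that `max_unit` (16 for VCPU in the log) independently caps what a single allocation may request, regardless of the remaining overall capacity.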
[ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] Traceback (most recent call last): [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self.driver.spawn(context, instance, image_meta, [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self._vmops.spawn(context, instance, image_meta, injected_files, [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] vm_ref = self.build_virtual_machine(instance, [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] vif_infos = vmwarevif.get_vif_info(self._session, [ 626.649505] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] for vif in network_info: [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 626.650130] env[59534]: ERROR 
nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return self._sync_wrapper(fn, *args, **kwargs) [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self.wait() [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self[:] = self._gt.wait() [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return self._exit_event.wait() [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] result = hub.switch() [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return self.greenlet.switch() [ 626.650130] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] result = function(*args, **kwargs) [ 
626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] return func(*args, **kwargs) [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] raise e [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] nwinfo = self.network_api.allocate_for_instance( [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] created_port_ids = self._update_ports_for_instance( [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] with excutils.save_and_reraise_exception(): [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 626.651727] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] self.force_reraise() [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: 
f42f768c-09ad-44b6-b294-681f392bb483] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] raise self.value [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] updated_port = self._update_port( [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] _ensure_no_port_binding_failure(port) [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] raise exception.PortBindingFailed(port_id=port['id']) [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] nova.exception.PortBindingFailed: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information. [ 626.652292] env[59534]: ERROR nova.compute.manager [instance: f42f768c-09ad-44b6-b294-681f392bb483] [ 626.652963] env[59534]: DEBUG nova.compute.utils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 626.652963] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.271s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.659596] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Build of instance f42f768c-09ad-44b6-b294-681f392bb483 was re-scheduled: Binding failed for port 01c27392-b745-429b-b2c0-4d8df0c419a8, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 626.659596] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 626.659697] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquiring lock "refresh_cache-f42f768c-09ad-44b6-b294-681f392bb483" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.659779] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 
tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Acquired lock "refresh_cache-f42f768c-09ad-44b6-b294-681f392bb483" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.660603] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 626.797192] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.797719] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.874087] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.890562] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Releasing lock "refresh_cache-c4837d87-be47-4a47-9703-f73eeddb0053" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.891145] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 626.891445] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 626.892163] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ea377e10-4938-4f47-88be-159ef03ffa03 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.909076] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7eb76ba-c2d2-45a5-91ad-a78bfb8ee681 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.938498] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c4837d87-be47-4a47-9703-f73eeddb0053 could not be found. [ 626.938780] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 626.941020] env[59534]: INFO nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Took 0.05 seconds to destroy the instance on the hypervisor. 
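The destroy sequence above shows a deliberate pattern: the VMware driver logs "Instance does not exist on backend" (InstanceNotFound) and still reports "Instance destroyed", so tearing down an instance that never actually spawned — e.g. after a PortBindingFailed — does not itself fail. A hedged sketch of that pattern, with illustrative names rather than the real nova API:

```python
# Sketch of the teardown pattern visible above: a missing backend VM is
# treated as a successful destroy rather than an error. The function and
# callback names here are illustrative, not the nova vmops interface.
class InstanceNotFound(Exception):
    pass

def destroy(lookup_vm, delete_vm, instance_uuid):
    try:
        vm_ref = lookup_vm(instance_uuid)
    except InstanceNotFound:
        # Matches the WARNING above: nothing exists on the hypervisor,
        # so there is nothing to delete; treat the destroy as complete.
        return 'not-found'
    delete_vm(vm_ref)
    return 'destroyed'
```

This is why the log can report "Took 0.05 seconds to destroy the instance on the hypervisor" for an instance whose build never produced a VM.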
[ 626.941020] env[59534]: DEBUG oslo.service.loopingcall [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 626.941020] env[59534]: DEBUG nova.compute.manager [-] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 626.941020] env[59534]: DEBUG nova.network.neutron [-] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 626.960122] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53f995da-6244-4699-a4b7-29ffb2a9b77b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.968056] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da3a093-af5e-4beb-b370-97af01cfbc25 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.003309] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a2849e-2554-48c1-8069-daf071eb611d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.011683] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49f3007b-2826-45d1-b6ea-a4c66483e59e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.024861] env[59534]: DEBUG nova.compute.provider_tree [None 
req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.034072] env[59534]: DEBUG nova.scheduler.client.report [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.049760] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.398s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.050425] env[59534]: ERROR nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information. 
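Every traceback in this log passes through `oslo_utils.excutils.save_and_reraise_exception` (`__exit__` → `force_reraise` → `raise self.value`). A simplified stdlib analogue — an assumption-laden sketch, not the oslo implementation — shows the shape of that helper: cleanup runs in the `with` body while an exception is in flight, and the original exception is re-raised afterwards unless the caller opts out:

```python
# Simplified stdlib analogue (assumption: not the oslo_utils source) of
# save_and_reraise_exception, the context manager seen in the tracebacks
# above. Cleanup code runs inside the "with" body; on exit the original
# exception propagates unless the caller sets reraise = False.
class SaveAndReraise:
    def __init__(self):
        self.reraise = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc is not None and not self.reraise:
            return True   # caller chose to swallow the exception
        return False      # let the original exception re-raise
```

In the trace this is why the `PortBindingFailed` raised deep in `_update_port` still surfaces from `_update_ports_for_instance`: the helper ran its cleanup, then re-raised the saved exception.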
[ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Traceback (most recent call last): [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self.driver.spawn(context, instance, image_meta, [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] vm_ref = self.build_virtual_machine(instance, [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] vif_infos = vmwarevif.get_vif_info(self._session, [ 627.050425] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] for vif in network_info: [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 627.050833] env[59534]: ERROR 
nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return self._sync_wrapper(fn, *args, **kwargs) [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self.wait() [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self[:] = self._gt.wait() [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return self._exit_event.wait() [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] result = hub.switch() [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return self.greenlet.switch() [ 627.050833] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] result = function(*args, **kwargs) [ 
627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] return func(*args, **kwargs) [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] raise e [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] nwinfo = self.network_api.allocate_for_instance( [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] created_port_ids = self._update_ports_for_instance( [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] with excutils.save_and_reraise_exception(): [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 627.051186] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] self.force_reraise() [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 
8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] raise self.value [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] updated_port = self._update_port( [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] _ensure_no_port_binding_failure(port) [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] raise exception.PortBindingFailed(port_id=port['id']) [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] nova.exception.PortBindingFailed: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information. [ 627.051497] env[59534]: ERROR nova.compute.manager [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] [ 627.051497] env[59534]: DEBUG nova.compute.utils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 627.052735] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Build of instance 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2 was re-scheduled: Binding failed for port 06045837-635b-4a38-bd57-9b1d16b054c1, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 627.052994] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 627.053223] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquiring lock "refresh_cache-8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.053361] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Acquired lock "refresh_cache-8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.053604] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Building 
network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.065671] env[59534]: DEBUG nova.network.neutron [-] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.072675] env[59534]: DEBUG nova.network.neutron [-] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.081385] env[59534]: INFO nova.compute.manager [-] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Took 0.14 seconds to deallocate network for instance. [ 627.083273] env[59534]: DEBUG nova.compute.claims [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 627.086110] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.086110] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.100494] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df 
tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Successfully created port: 9dbb0ab7-5a0a-4878-850b-2d43c31d668f {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 627.168203] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.304882] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.317275] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Releasing lock "refresh_cache-f42f768c-09ad-44b6-b294-681f392bb483" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.317617] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 627.317677] env[59534]: DEBUG nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 627.317809] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 627.352329] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-533cbe4f-4459-4a8f-b5c2-7f87c077422b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.364161] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43f1f253-6626-4652-aa82-179d9c662d2d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.394528] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f9fb601-2fa2-4bf6-ba84-ac1745fb0373 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.401752] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.403911] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddbdc15a-8659-4821-adf0-b1ed08633dac {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.410692] env[59534]: DEBUG nova.network.neutron [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.423325] env[59534]: DEBUG nova.compute.provider_tree [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.432031] env[59534]: INFO nova.compute.manager [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] [instance: f42f768c-09ad-44b6-b294-681f392bb483] Took 0.11 seconds to deallocate network for instance. 
[ 627.435300] env[59534]: DEBUG nova.scheduler.client.report [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.455735] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.372s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.456609] env[59534]: ERROR nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. 
[ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Traceback (most recent call last): [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self.driver.spawn(context, instance, image_meta, [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self._vmops.spawn(context, instance, image_meta, injected_files, [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] vm_ref = self.build_virtual_machine(instance, [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] vif_infos = vmwarevif.get_vif_info(self._session, [ 627.456609] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] for vif in network_info: [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 627.456972] env[59534]: ERROR 
nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return self._sync_wrapper(fn, *args, **kwargs) [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self.wait() [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self[:] = self._gt.wait() [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return self._exit_event.wait() [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] result = hub.switch() [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return self.greenlet.switch() [ 627.456972] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] result = function(*args, **kwargs) [ 
627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] return func(*args, **kwargs) [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] raise e [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] nwinfo = self.network_api.allocate_for_instance( [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] created_port_ids = self._update_ports_for_instance( [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] with excutils.save_and_reraise_exception(): [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 627.457344] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] self.force_reraise() [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: 
c4837d87-be47-4a47-9703-f73eeddb0053] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] raise self.value [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] updated_port = self._update_port( [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] _ensure_no_port_binding_failure(port) [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] raise exception.PortBindingFailed(port_id=port['id']) [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] nova.exception.PortBindingFailed: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. [ 627.457678] env[59534]: ERROR nova.compute.manager [instance: c4837d87-be47-4a47-9703-f73eeddb0053] [ 627.458171] env[59534]: DEBUG nova.compute.utils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 627.459949] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Build of instance c4837d87-be47-4a47-9703-f73eeddb0053 was re-scheduled: Binding failed for port b6831d1f-22a2-4cdf-84ad-ea32b3046dce, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 627.460664] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 627.460664] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquiring lock "refresh_cache-c4837d87-be47-4a47-9703-f73eeddb0053" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.460776] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Acquired lock "refresh_cache-c4837d87-be47-4a47-9703-f73eeddb0053" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.460893] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Building network info cache for instance {{(pid=59534) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.491924] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.503929] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Releasing lock "refresh_cache-0f052eb8-59d7-4dcb-9d2f-bc7740424a66" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.503929] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 627.503929] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 627.507814] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f1644d24-b7a9-4600-be2d-9f92fb59470d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.520229] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bcdbffe-15de-482b-a72f-3c35aa91cef3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.551815] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0f052eb8-59d7-4dcb-9d2f-bc7740424a66 could not be found. 
[ 627.551941] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 627.552651] env[59534]: INFO nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Took 0.05 seconds to destroy the instance on the hypervisor. [ 627.552651] env[59534]: DEBUG oslo.service.loopingcall [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 627.553401] env[59534]: ERROR nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. 
[ 627.553401] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 627.553401] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 627.553401] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 627.553401] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 627.553401] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 627.553401] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 627.553401] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 627.553401] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 627.553401] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 627.553401] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 627.553401] env[59534]: ERROR nova.compute.manager raise self.value [ 627.553401] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 627.553401] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 627.553401] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 627.553401] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 627.557140] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 627.557140] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 627.557140] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. [ 627.557140] env[59534]: ERROR nova.compute.manager [ 627.557140] env[59534]: Traceback (most recent call last): [ 627.557140] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 627.557140] env[59534]: listener.cb(fileno) [ 627.557140] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 627.557140] env[59534]: result = function(*args, **kwargs) [ 627.557140] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 627.557140] env[59534]: return func(*args, **kwargs) [ 627.557140] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 627.557140] env[59534]: raise e [ 627.557140] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 627.557140] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 627.557140] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 627.557140] env[59534]: created_port_ids = self._update_ports_for_instance( [ 627.557140] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 627.557140] env[59534]: with excutils.save_and_reraise_exception(): [ 627.557140] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 627.557140] env[59534]: self.force_reraise() [ 627.557140] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 627.557140] env[59534]: raise self.value [ 627.557140] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 627.557140] env[59534]: updated_port = self._update_port( [ 627.557140] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 627.557140] env[59534]: _ensure_no_port_binding_failure(port) [ 627.557140] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 627.557140] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 627.558343] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. [ 627.558343] env[59534]: Removing descriptor: 16 [ 627.558343] env[59534]: DEBUG nova.compute.manager [-] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 627.558343] env[59534]: DEBUG nova.network.neutron [-] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 627.558343] env[59534]: ERROR nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. 
[ 627.558343] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Traceback (most recent call last):
[ 627.558343] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 627.558343] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     yield resources
[ 627.558343] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 627.558343] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     self.driver.spawn(context, instance, image_meta,
[ 627.558343] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     vm_ref = self.build_virtual_machine(instance,
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     for vif in network_info:
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     return self._sync_wrapper(fn, *args, **kwargs)
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     self.wait()
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 627.558627] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     self[:] = self._gt.wait()
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     return self._exit_event.wait()
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     result = hub.switch()
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     return self.greenlet.switch()
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     result = function(*args, **kwargs)
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     return func(*args, **kwargs)
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     raise e
[ 627.558965] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     nwinfo = self.network_api.allocate_for_instance(
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     created_port_ids = self._update_ports_for_instance(
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     with excutils.save_and_reraise_exception():
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     self.force_reraise()
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     raise self.value
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     updated_port = self._update_port(
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 627.559302] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     _ensure_no_port_binding_failure(port)
[ 627.559629] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 627.559629] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]     raise exception.PortBindingFailed(port_id=port['id'])
[ 627.559629] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information.
[ 627.559629] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd]
[ 627.559629] env[59534]: INFO nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Terminating instance
[ 627.559629] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquiring lock "refresh_cache-734b416a-659c-451b-82ea-0b8a2796fffd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 627.559629] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquired lock "refresh_cache-734b416a-659c-451b-82ea-0b8a2796fffd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 627.559811] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 627.564292] env[59534]: INFO nova.scheduler.client.report [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Deleted allocations for instance f42f768c-09ad-44b6-b294-681f392bb483
[ 627.593231] env[59534]: DEBUG oslo_concurrency.lockutils [None req-110abea0-59f5-4dcf-8b58-430bf3044949 tempest-ServerDiagnosticsNegativeTest-350562321 tempest-ServerDiagnosticsNegativeTest-350562321-project-member] Lock "f42f768c-09ad-44b6-b294-681f392bb483" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.440s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 627.630628] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 627.706833] env[59534]: DEBUG nova.network.neutron [-] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 627.716294] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 627.716406] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 627.717972] env[59534]: INFO nova.compute.claims [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 627.722782] env[59534]: DEBUG nova.network.neutron [-] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 627.736643] env[59534]: INFO nova.compute.manager [-] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Took 0.18 seconds to deallocate network for instance.
[ 627.740762] env[59534]: DEBUG nova.compute.claims [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 627.740901] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 627.796631] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 627.805517] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 627.811864] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Releasing lock "refresh_cache-8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 627.812095] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 627.812253] env[59534]: DEBUG nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 627.812412] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 628.022905] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-539fa54d-cd7f-4eb2-b3b0-728e80166568 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.026521] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 628.035462] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c90eca1-1b65-4b5b-9216-f5e7eed5940f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.075113] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7035ca1b-6001-4936-a227-09187ecbad60 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.083969] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec909173-43e8-421a-846a-42a75f552f11 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.099597] env[59534]: DEBUG nova.compute.provider_tree [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 628.109149] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Successfully created port: 9290e007-9d01-4b34-a972-ef5cbd7ff2c7 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 628.117447] env[59534]: DEBUG nova.scheduler.client.report [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 628.142378] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.426s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 628.142860] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 628.146071] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 628.149559] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.409s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 628.159132] env[59534]: DEBUG nova.network.neutron [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 628.180187] env[59534]: INFO nova.compute.manager [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] [instance: 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2] Took 0.37 seconds to deallocate network for instance.
[ 628.226707] env[59534]: DEBUG nova.compute.utils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 628.230945] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 628.230945] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 628.244368] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 628.309288] env[59534]: INFO nova.scheduler.client.report [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Deleted allocations for instance 8b13a5ee-6538-4eb9-9ae6-b3a960a958f2
[ 628.348874] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ad1f4f1e-c5b5-4af2-9dcf-668c5ccb84cf tempest-DeleteServersAdminTestJSON-500150000 tempest-DeleteServersAdminTestJSON-500150000-project-member] Lock "8b13a5ee-6538-4eb9-9ae6-b3a960a958f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.407s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 628.350706] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 628.383702] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 628.383962] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 628.384120] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 628.384302] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 628.384443] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 628.384578] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 628.384822] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 628.384978] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 628.385156] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 628.385308] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 628.385467] env[59534]: DEBUG nova.virt.hardware [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 628.386647] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-001546f4-8843-4af0-a9eb-cee280c99d15 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.400586] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb63f4a2-f582-4153-8a84-e52266bb1298 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.451527] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa9247c-b4cc-488f-9780-0a35de7379c8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.459017] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65448c69-08dd-4b2c-a621-dab583e2a048 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.493489] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a0ccd6a-f7a0-48b5-9d39-fb94f302fa4b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.501387] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a05b094f-a6e1-4f60-9105-b60d4efd0369 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.515234] env[59534]: DEBUG nova.compute.provider_tree [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 628.526423] env[59534]: DEBUG nova.scheduler.client.report [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 628.544329] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.395s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 628.545160] env[59534]: ERROR nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information.
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Traceback (most recent call last):
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     self.driver.spawn(context, instance, image_meta,
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     vm_ref = self.build_virtual_machine(instance,
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 628.545160] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     for vif in network_info:
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     return self._sync_wrapper(fn, *args, **kwargs)
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     self.wait()
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     self[:] = self._gt.wait()
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     return self._exit_event.wait()
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     result = hub.switch()
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     return self.greenlet.switch()
[ 628.545662] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     result = function(*args, **kwargs)
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     return func(*args, **kwargs)
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     raise e
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     nwinfo = self.network_api.allocate_for_instance(
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     created_port_ids = self._update_ports_for_instance(
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66]     with excutils.save_and_reraise_exception():
[ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 
0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 628.546199] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] self.force_reraise() [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] raise self.value [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] updated_port = self._update_port( [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] _ensure_no_port_binding_failure(port) [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] raise exception.PortBindingFailed(port_id=port['id']) [ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] nova.exception.PortBindingFailed: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. 
[ 628.546545] env[59534]: ERROR nova.compute.manager [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] [ 628.546807] env[59534]: DEBUG nova.compute.utils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 628.548622] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Build of instance 0f052eb8-59d7-4dcb-9d2f-bc7740424a66 was re-scheduled: Binding failed for port 00059401-5bcb-4b4e-a1d4-788f34da675b, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 628.548622] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 628.548622] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquiring lock "refresh_cache-0f052eb8-59d7-4dcb-9d2f-bc7740424a66" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.548622] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Acquired lock "refresh_cache-0f052eb8-59d7-4dcb-9d2f-bc7740424a66" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.549284] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 628.616712] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Updating instance_info_cache with network_info: [] 
{{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.629019] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Releasing lock "refresh_cache-734b416a-659c-451b-82ea-0b8a2796fffd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.629019] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 628.629019] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 628.629019] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eb1c926e-ffed-4229-abfd-08077b76c688 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.639558] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73c98e13-05bf-46b6-93df-2258870f2d00 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.664557] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance does not exist on backend: 
nova.exception.InstanceNotFound: Instance 734b416a-659c-451b-82ea-0b8a2796fffd could not be found. [ 628.664557] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 628.665332] env[59534]: INFO nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 628.665332] env[59534]: DEBUG oslo.service.loopingcall [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 628.665731] env[59534]: DEBUG nova.compute.manager [-] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 628.665731] env[59534]: DEBUG nova.network.neutron [-] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 628.673272] env[59534]: DEBUG nova.policy [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5cd99c6785514a7e86a0e6f00e127369', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '29492fb1a4a3464ba0b3904562c4df99', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 628.680222] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.686517] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.694867] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Releasing lock "refresh_cache-c4837d87-be47-4a47-9703-f73eeddb0053" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.696110] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 628.696110] env[59534]: DEBUG nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 628.696110] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 628.745354] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.747815] env[59534]: DEBUG nova.network.neutron [-] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.754030] env[59534]: DEBUG nova.network.neutron [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.757317] env[59534]: DEBUG nova.network.neutron [-] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.766897] env[59534]: INFO nova.compute.manager [-] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Took 0.10 seconds to deallocate network for instance. 
[ 628.769997] env[59534]: DEBUG nova.compute.claims [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 628.770185] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.772663] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.774548] env[59534]: INFO nova.compute.manager [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] [instance: c4837d87-be47-4a47-9703-f73eeddb0053] Took 0.08 seconds to deallocate network for instance. 
[ 628.873063] env[59534]: INFO nova.scheduler.client.report [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Deleted allocations for instance c4837d87-be47-4a47-9703-f73eeddb0053 [ 628.907210] env[59534]: DEBUG oslo_concurrency.lockutils [None req-00e7fe48-33d0-4189-8cac-e50a02e11e2d tempest-MigrationsAdminTest-2033893428 tempest-MigrationsAdminTest-2033893428-project-member] Lock "c4837d87-be47-4a47-9703-f73eeddb0053" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.578s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.988318] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a59ee153-60f3-49bb-a839-8780b92eca3b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.996811] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7f2c733-ca7d-45eb-b61a-5e82b1aa7be1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.028162] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc9e59ef-ea8c-4aea-b217-ed086a6070b9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.035942] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a997bdc-5ddd-4f84-8334-2d0c1c757b48 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.050509] env[59534]: DEBUG nova.compute.provider_tree [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Inventory 
has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 629.058543] env[59534]: DEBUG nova.scheduler.client.report [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 629.073904] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.074529] env[59534]: ERROR nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. 
[ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Traceback (most recent call last): [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] self.driver.spawn(context, instance, image_meta, [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] vm_ref = self.build_virtual_machine(instance, [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] vif_infos = vmwarevif.get_vif_info(self._session, [ 629.074529] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] for vif in network_info: [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 629.075424] env[59534]: ERROR 
nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] return self._sync_wrapper(fn, *args, **kwargs) [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] self.wait() [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] self[:] = self._gt.wait() [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] return self._exit_event.wait() [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] result = hub.switch() [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] return self.greenlet.switch() [ 629.075424] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] result = function(*args, **kwargs) [ 
629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] return func(*args, **kwargs) [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] raise e [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] nwinfo = self.network_api.allocate_for_instance( [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] created_port_ids = self._update_ports_for_instance( [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] with excutils.save_and_reraise_exception(): [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 629.075831] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] self.force_reraise() [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 
734b416a-659c-451b-82ea-0b8a2796fffd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] raise self.value [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] updated_port = self._update_port( [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] _ensure_no_port_binding_failure(port) [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] raise exception.PortBindingFailed(port_id=port['id']) [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] nova.exception.PortBindingFailed: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. [ 629.076202] env[59534]: ERROR nova.compute.manager [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] [ 629.076202] env[59534]: DEBUG nova.compute.utils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 629.076721] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Build of instance 734b416a-659c-451b-82ea-0b8a2796fffd was re-scheduled: Binding failed for port 01c3c143-df9f-46cc-89b1-8070df187a5d, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 629.077158] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 629.080023] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquiring lock "refresh_cache-734b416a-659c-451b-82ea-0b8a2796fffd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.080023] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Acquired lock "refresh_cache-734b416a-659c-451b-82ea-0b8a2796fffd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 629.080023] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Building network info cache for instance {{(pid=59534) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 629.226124] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.657835] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.674248] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Releasing lock "refresh_cache-0f052eb8-59d7-4dcb-9d2f-bc7740424a66" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.674248] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 629.674248] env[59534]: DEBUG nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 629.674248] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 629.738074] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.749019] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Releasing lock "refresh_cache-734b416a-659c-451b-82ea-0b8a2796fffd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.749019] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 629.749019] env[59534]: DEBUG nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 629.749019] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 629.757271] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.765562] env[59534]: DEBUG nova.network.neutron [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.780379] env[59534]: INFO nova.compute.manager [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] [instance: 0f052eb8-59d7-4dcb-9d2f-bc7740424a66] Took 0.10 seconds to deallocate network for instance. 
[ 629.814622] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.824926] env[59534]: DEBUG nova.network.neutron [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.836694] env[59534]: INFO nova.compute.manager [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] [instance: 734b416a-659c-451b-82ea-0b8a2796fffd] Took 0.09 seconds to deallocate network for instance. 
[ 629.896054] env[59534]: INFO nova.scheduler.client.report [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Deleted allocations for instance 0f052eb8-59d7-4dcb-9d2f-bc7740424a66 [ 629.924395] env[59534]: DEBUG oslo_concurrency.lockutils [None req-78ffdc23-194c-4ad7-a295-19e4795469cc tempest-FloatingIPsAssociationNegativeTestJSON-1832406944 tempest-FloatingIPsAssociationNegativeTestJSON-1832406944-project-member] Lock "0f052eb8-59d7-4dcb-9d2f-bc7740424a66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.107s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.955702] env[59534]: INFO nova.scheduler.client.report [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Deleted allocations for instance 734b416a-659c-451b-82ea-0b8a2796fffd [ 629.974905] env[59534]: DEBUG oslo_concurrency.lockutils [None req-63bcbf14-9acb-489d-a5ba-ae97bee82d0d tempest-ServersTestManualDisk-123449131 tempest-ServersTestManualDisk-123449131-project-member] Lock "734b416a-659c-451b-82ea-0b8a2796fffd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.439s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.638028] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Successfully created port: 87fcf100-d495-4ce5-a821-bc64c3aabd53 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 631.621437] env[59534]: ERROR 
nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. [ 631.621437] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 631.621437] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 631.621437] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 631.621437] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 631.621437] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 631.621437] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 631.621437] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 631.621437] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 631.621437] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 631.621437] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 631.621437] env[59534]: ERROR nova.compute.manager raise self.value [ 631.621437] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 631.621437] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 631.621437] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 631.621437] env[59534]: 
ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 631.622106] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 631.622106] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 631.622106] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. [ 631.622106] env[59534]: ERROR nova.compute.manager [ 631.622106] env[59534]: Traceback (most recent call last): [ 631.622106] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 631.622106] env[59534]: listener.cb(fileno) [ 631.622106] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 631.622106] env[59534]: result = function(*args, **kwargs) [ 631.622106] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 631.622106] env[59534]: return func(*args, **kwargs) [ 631.622106] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 631.622106] env[59534]: raise e [ 631.622106] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 631.622106] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 631.622106] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 631.622106] env[59534]: created_port_ids = self._update_ports_for_instance( [ 631.622106] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 631.622106] env[59534]: with excutils.save_and_reraise_exception(): [ 631.622106] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 631.622106] env[59534]: self.force_reraise() [ 631.622106] 
env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 631.622106] env[59534]: raise self.value [ 631.622106] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 631.622106] env[59534]: updated_port = self._update_port( [ 631.622106] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 631.622106] env[59534]: _ensure_no_port_binding_failure(port) [ 631.622106] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 631.622106] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 631.622947] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. [ 631.622947] env[59534]: Removing descriptor: 17 [ 631.622947] env[59534]: ERROR nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. 
[ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Traceback (most recent call last): [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] yield resources [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self.driver.spawn(context, instance, image_meta, [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self._vmops.spawn(context, instance, image_meta, injected_files, [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 631.622947] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] vm_ref = self.build_virtual_machine(instance, [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] vif_infos = vmwarevif.get_vif_info(self._session, [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 631.623411] env[59534]: ERROR 
nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] for vif in network_info: [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return self._sync_wrapper(fn, *args, **kwargs) [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self.wait() [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self[:] = self._gt.wait() [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return self._exit_event.wait() [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 631.623411] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] result = hub.switch() [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return self.greenlet.switch() [ 631.623887] env[59534]: ERROR 
nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] result = function(*args, **kwargs) [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return func(*args, **kwargs) [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] raise e [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] nwinfo = self.network_api.allocate_for_instance( [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] created_port_ids = self._update_ports_for_instance( [ 631.623887] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] with excutils.save_and_reraise_exception(): [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 
4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self.force_reraise() [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] raise self.value [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] updated_port = self._update_port( [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] _ensure_no_port_binding_failure(port) [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] raise exception.PortBindingFailed(port_id=port['id']) [ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. 
[ 631.624259] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] [ 631.624630] env[59534]: INFO nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Terminating instance [ 631.624630] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquiring lock "refresh_cache-4a6f2391-5c17-452d-baf2-c4c62a2dde72" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.624630] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquired lock "refresh_cache-4a6f2391-5c17-452d-baf2-c4c62a2dde72" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.624798] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 631.726830] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 632.259483] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.269498] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Releasing lock "refresh_cache-4a6f2391-5c17-452d-baf2-c4c62a2dde72" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 632.269945] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 632.270173] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 632.270851] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-628ffb38-8608-44a8-83a5-879bb3d00b6f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.281592] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-360cf1e5-82e9-4ca4-9219-7122da735c95 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.310017] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a6f2391-5c17-452d-baf2-c4c62a2dde72 could not be found. 
[ 632.310065] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 632.310467] env[59534]: INFO nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Took 0.04 seconds to destroy the instance on the hypervisor. [ 632.310570] env[59534]: DEBUG oslo.service.loopingcall [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 632.311065] env[59534]: DEBUG nova.compute.manager [-] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 632.311175] env[59534]: DEBUG nova.network.neutron [-] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 632.379450] env[59534]: DEBUG nova.network.neutron [-] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 632.389378] env[59534]: DEBUG nova.network.neutron [-] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.400127] env[59534]: INFO nova.compute.manager [-] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Took 0.09 seconds to deallocate network for instance. [ 632.403182] env[59534]: DEBUG nova.compute.claims [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 632.403361] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.410021] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.564596] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a921a0ab-73e2-4497-8e70-1d25c5a32022 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.571986] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f1ab64b-506e-4275-a1e2-74ff43e00aad {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.605863] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7b35d8c-9ad0-44e3-9bf4-6b3af41068e9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.616079] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-231fcf72-c6b8-4826-a678-51a2e116ab1f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.631308] env[59534]: DEBUG nova.compute.provider_tree [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.648489] env[59534]: DEBUG nova.scheduler.client.report [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.667277] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 
tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.263s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 632.668066] env[59534]: ERROR nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information.
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Traceback (most recent call last):
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self.driver.spawn(context, instance, image_meta,
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] vm_ref = self.build_virtual_machine(instance,
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] vif_infos = vmwarevif.get_vif_info(self._session,
[ 632.668066] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] for vif in network_info:
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return self._sync_wrapper(fn, *args, **kwargs)
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self.wait()
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self[:] = self._gt.wait()
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return self._exit_event.wait()
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] result = hub.switch()
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return self.greenlet.switch()
[ 632.669396] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] result = function(*args, **kwargs)
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] return func(*args, **kwargs)
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] raise e
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] nwinfo = self.network_api.allocate_for_instance(
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] created_port_ids = self._update_ports_for_instance(
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] with excutils.save_and_reraise_exception():
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 632.670169] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] self.force_reraise()
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] raise self.value
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] updated_port = self._update_port(
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] _ensure_no_port_binding_failure(port)
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] raise exception.PortBindingFailed(port_id=port['id'])
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] nova.exception.PortBindingFailed: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information.
[ 632.670862] env[59534]: ERROR nova.compute.manager [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72]
[ 632.670862] env[59534]: DEBUG nova.compute.utils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 632.672458] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Build of instance 4a6f2391-5c17-452d-baf2-c4c62a2dde72 was re-scheduled: Binding failed for port 0cc5d8d9-60b1-46fb-a63d-d39eb9741a8a, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 632.672458] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 632.672458] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquiring lock "refresh_cache-4a6f2391-5c17-452d-baf2-c4c62a2dde72" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 632.672458] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Acquired lock "refresh_cache-4a6f2391-5c17-452d-baf2-c4c62a2dde72" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 632.672800] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 632.761707] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 632.841629] env[59534]: ERROR nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
[ 632.841629] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 632.841629] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 632.841629] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 632.841629] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 632.841629] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 632.841629] env[59534]: ERROR nova.compute.manager raise self.value
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 632.841629] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 632.841629] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 632.841629] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 632.842055] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 632.842055] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 632.842055] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
[ 632.842055] env[59534]: ERROR nova.compute.manager
[ 632.842055] env[59534]: Traceback (most recent call last):
[ 632.842055] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 632.842055] env[59534]: listener.cb(fileno)
[ 632.842055] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 632.842055] env[59534]: result = function(*args, **kwargs)
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 632.842055] env[59534]: return func(*args, **kwargs)
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 632.842055] env[59534]: raise e
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 632.842055] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 632.842055] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 632.842055] env[59534]: with excutils.save_and_reraise_exception():
[ 632.842055] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 632.842055] env[59534]: self.force_reraise()
[ 632.842055] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 632.842055] env[59534]: raise self.value
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 632.842055] env[59534]: updated_port = self._update_port(
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 632.842055] env[59534]: _ensure_no_port_binding_failure(port)
[ 632.842055] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 632.842055] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 632.842833] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
[ 632.842833] env[59534]: Removing descriptor: 18
[ 632.842833] env[59534]: ERROR nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Traceback (most recent call last):
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] yield resources
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] self.driver.spawn(context, instance, image_meta,
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 632.842833] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] vm_ref = self.build_virtual_machine(instance,
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] vif_infos = vmwarevif.get_vif_info(self._session,
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] for vif in network_info:
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] return self._sync_wrapper(fn, *args, **kwargs)
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] self.wait()
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] self[:] = self._gt.wait()
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] return self._exit_event.wait()
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 632.843155] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] result = hub.switch()
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] return self.greenlet.switch()
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] result = function(*args, **kwargs)
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] return func(*args, **kwargs)
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] raise e
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] nwinfo = self.network_api.allocate_for_instance(
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] created_port_ids = self._update_ports_for_instance(
[ 632.843562] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] with excutils.save_and_reraise_exception():
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] self.force_reraise()
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] raise self.value
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] updated_port = self._update_port(
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] _ensure_no_port_binding_failure(port)
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] raise exception.PortBindingFailed(port_id=port['id'])
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
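The frames at oslo_utils/excutils.py:227 (`__exit__`) and :200 (`force_reraise`) that pad every trace above come from `save_and_reraise_exception`, the oslo context manager Nova uses in `_update_ports_for_instance` so cleanup (e.g. unbinding already-created ports) can run before the original error propagates. Below is a simplified reconstruction of the pattern for illustration; the real oslo_utils implementation has more options (logging, `reraise` toggling) than shown here.

```python
# Simplified sketch of the oslo_utils.excutils.save_and_reraise_exception
# pattern (not oslo's exact code): used inside an `except` block, it captures
# the in-flight exception on entry, lets the cleanup body run, then re-raises
# the original exception on clean exit.
import sys


class save_and_reraise_exception:
    def __init__(self):
        self.reraise = True
        self.value = None

    def __enter__(self):
        # Capture the exception currently being handled by the caller.
        self.value = sys.exc_info()[1]
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            # The cleanup body itself failed; let that error propagate.
            return False
        if self.reraise and self.value is not None:
            self.force_reraise()   # frame seen as excutils.py force_reraise
        return False

    def force_reraise(self):
        raise self.value
```

This explains why the port-binding error appears to be raised from `__exit__`/`force_reraise` in the log: the original `PortBindingFailed` is stashed, rollback runs, and the same exception object is thrown again.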
[ 632.843975] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]
[ 632.844547] env[59534]: INFO nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Terminating instance
[ 632.847787] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Acquiring lock "refresh_cache-533ba77f-7191-4af2-b3c7-204efa001ffa" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 632.847934] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Acquired lock "refresh_cache-533ba77f-7191-4af2-b3c7-204efa001ffa" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 632.848103] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 632.945506] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 633.029236] env[59534]: ERROR nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information.
[ 633.029236] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 633.029236] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 633.029236] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 633.029236] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 633.029236] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 633.029236] env[59534]: ERROR nova.compute.manager raise self.value
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 633.029236] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 633.029236] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 633.029236] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 633.029762] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 633.029762] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 633.029762] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information.
[ 633.029762] env[59534]: ERROR nova.compute.manager
[ 633.029762] env[59534]: Traceback (most recent call last):
[ 633.029762] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 633.029762] env[59534]: listener.cb(fileno)
[ 633.029762] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 633.029762] env[59534]: result = function(*args, **kwargs)
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 633.029762] env[59534]: return func(*args, **kwargs)
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 633.029762] env[59534]: raise e
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 633.029762] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 633.029762] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 633.029762] env[59534]: with excutils.save_and_reraise_exception():
[ 633.029762] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 633.029762] env[59534]: self.force_reraise()
[ 633.029762] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 633.029762] env[59534]: raise self.value
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 633.029762] env[59534]: updated_port = self._update_port(
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 633.029762] env[59534]: _ensure_no_port_binding_failure(port)
[ 633.029762] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 633.029762] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 633.030825] env[59534]: nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information.
[ 633.030825] env[59534]: Removing descriptor: 19
[ 633.030825] env[59534]: ERROR nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information.
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Traceback (most recent call last):
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] yield resources
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self.driver.spawn(context, instance, image_meta,
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 633.030825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] vm_ref = self.build_virtual_machine(instance,
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] vif_infos = vmwarevif.get_vif_info(self._session,
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] for vif in network_info:
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return self._sync_wrapper(fn, *args, **kwargs)
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self.wait()
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self[:] = self._gt.wait()
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return self._exit_event.wait()
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 633.031268] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] result = hub.switch()
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return self.greenlet.switch()
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] result = function(*args, **kwargs)
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return func(*args, **kwargs)
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] raise e
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] nwinfo = self.network_api.allocate_for_instance(
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] created_port_ids = self._update_ports_for_instance(
[ 633.031681] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] with excutils.save_and_reraise_exception():
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self.force_reraise()
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] raise self.value
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] updated_port = self._update_port(
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] _ensure_no_port_binding_failure(port)
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] raise exception.PortBindingFailed(port_id=port['id'])
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information.
[ 633.032166] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] [ 633.032519] env[59534]: INFO nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Terminating instance [ 633.037388] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquiring lock "refresh_cache-e105adff-476c-46b0-b795-daf44b69ef3a" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.037388] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquired lock "refresh_cache-e105adff-476c-46b0-b795-daf44b69ef3a" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.037388] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 633.108213] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.359831] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.369282] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Releasing lock "refresh_cache-4a6f2391-5c17-452d-baf2-c4c62a2dde72" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.369551] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 633.369664] env[59534]: DEBUG nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 633.369808] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 633.448444] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.457593] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.465215] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Releasing lock "refresh_cache-e105adff-476c-46b0-b795-daf44b69ef3a" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.466082] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 633.468319] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 633.468398] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b039172e-1f93-445c-a289-94d7f7afde36 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.472043] env[59534]: DEBUG nova.network.neutron [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.481315] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f719c6a-84e6-4a0f-b6e8-3e1c015a0d13 
{{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.494714] env[59534]: INFO nova.compute.manager [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] [instance: 4a6f2391-5c17-452d-baf2-c4c62a2dde72] Took 0.12 seconds to deallocate network for instance. [ 633.512532] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e105adff-476c-46b0-b795-daf44b69ef3a could not be found. [ 633.512881] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 633.513076] env[59534]: INFO nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 633.513317] env[59534]: DEBUG oslo.service.loopingcall [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 633.513523] env[59534]: DEBUG nova.compute.manager [-] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 633.513610] env[59534]: DEBUG nova.network.neutron [-] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 633.553290] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.563023] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Releasing lock "refresh_cache-533ba77f-7191-4af2-b3c7-204efa001ffa" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.563425] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 633.566258] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 633.566258] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-528e4f1f-4d53-4e76-8dc8-cd7a506e3b94 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.567546] env[59534]: DEBUG nova.network.neutron [-] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.578783] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01b82f63-5f4b-41fd-9364-fa5e379cbcde {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.589363] env[59534]: DEBUG nova.network.neutron [-] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.602494] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 533ba77f-7191-4af2-b3c7-204efa001ffa could not be found. 
[ 633.602707] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 633.602880] env[59534]: INFO nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Took 0.04 seconds to destroy the instance on the hypervisor. [ 633.603125] env[59534]: DEBUG oslo.service.loopingcall [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 633.604583] env[59534]: DEBUG nova.compute.manager [-] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 633.604583] env[59534]: DEBUG nova.network.neutron [-] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 633.606758] env[59534]: INFO nova.compute.manager [-] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Took 0.09 seconds to deallocate network for instance. 
[ 633.609291] env[59534]: DEBUG nova.compute.claims [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 633.609688] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.609688] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.625967] env[59534]: INFO nova.scheduler.client.report [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Deleted allocations for instance 4a6f2391-5c17-452d-baf2-c4c62a2dde72 [ 633.657113] env[59534]: DEBUG oslo_concurrency.lockutils [None req-6b8460d3-9b3c-4449-9a5b-f879c70ff2d5 tempest-ServerExternalEventsTest-1961432341 tempest-ServerExternalEventsTest-1961432341-project-member] Lock "4a6f2391-5c17-452d-baf2-c4c62a2dde72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.152s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.666354] env[59534]: DEBUG nova.network.neutron [-] [instance: 
533ba77f-7191-4af2-b3c7-204efa001ffa] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.680732] env[59534]: DEBUG nova.network.neutron [-] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.692101] env[59534]: INFO nova.compute.manager [-] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Took 0.09 seconds to deallocate network for instance. [ 633.694143] env[59534]: DEBUG nova.compute.claims [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 633.694591] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.763493] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9868e9e-daa9-4f98-ba34-dd98d08f2082 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.774128] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82af3718-3a70-49c5-84f0-b0978566d89a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.808163] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3da280d0-1b2a-4783-8448-96c5ec9fbf4f 
{{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.816527] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b8fe70-020a-45c9-b991-19878e17c744 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.834589] env[59534]: DEBUG nova.compute.provider_tree [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 633.853188] env[59534]: DEBUG nova.scheduler.client.report [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 633.870209] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.260s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.870838] env[59534]: ERROR nova.compute.manager [None 
req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information. [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Traceback (most recent call last): [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self.driver.spawn(context, instance, image_meta, [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] vm_ref = self.build_virtual_machine(instance, [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] vif_infos = vmwarevif.get_vif_info(self._session, [ 633.870838] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] for vif in network_info: [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return self._sync_wrapper(fn, *args, **kwargs) [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self.wait() [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self[:] = self._gt.wait() [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return self._exit_event.wait() [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] result = hub.switch() [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: 
e105adff-476c-46b0-b795-daf44b69ef3a] return self.greenlet.switch() [ 633.871184] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] result = function(*args, **kwargs) [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] return func(*args, **kwargs) [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] raise e [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] nwinfo = self.network_api.allocate_for_instance( [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] created_port_ids = self._update_ports_for_instance( [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] with 
excutils.save_and_reraise_exception(): [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 633.871509] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] self.force_reraise() [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] raise self.value [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] updated_port = self._update_port( [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] _ensure_no_port_binding_failure(port) [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] raise exception.PortBindingFailed(port_id=port['id']) [ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a] nova.exception.PortBindingFailed: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information. 
[ 633.871825] env[59534]: ERROR nova.compute.manager [instance: e105adff-476c-46b0-b795-daf44b69ef3a]
[ 633.871825] env[59534]: DEBUG nova.compute.utils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 633.873709] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.179s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 633.881739] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Build of instance e105adff-476c-46b0-b795-daf44b69ef3a was re-scheduled: Binding failed for port bef7ea21-2ba0-4ccb-a8b4-4fc5da944d3d, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 633.882316] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 633.882399] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquiring lock "refresh_cache-e105adff-476c-46b0-b795-daf44b69ef3a" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 633.882508] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Acquired lock "refresh_cache-e105adff-476c-46b0-b795-daf44b69ef3a" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 633.882658] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 633.941665] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 634.029847] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77b2de7b-c570-4015-9c00-c51b26bf5b98 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 634.036946] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa790d58-0207-453f-aeb2-b4264041d8a7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 634.071639] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bea97079-7edf-4abc-a646-5ff6c09f22d8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 634.079393] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8346578-062c-42a5-8ec9-f63078229b1d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 634.093737] env[59534]: DEBUG nova.compute.provider_tree [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 634.104662] env[59534]: DEBUG nova.scheduler.client.report [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 634.127131] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.253s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 634.127923] env[59534]: ERROR nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Traceback (most recent call last):
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     self.driver.spawn(context, instance, image_meta,
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     vm_ref = self.build_virtual_machine(instance,
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 634.127923] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     for vif in network_info:
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     return self._sync_wrapper(fn, *args, **kwargs)
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     self.wait()
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     self[:] = self._gt.wait()
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     return self._exit_event.wait()
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     result = hub.switch()
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     return self.greenlet.switch()
[ 634.128288] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     result = function(*args, **kwargs)
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     return func(*args, **kwargs)
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     raise e
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     nwinfo = self.network_api.allocate_for_instance(
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     created_port_ids = self._update_ports_for_instance(
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     with excutils.save_and_reraise_exception():
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 634.129599] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     self.force_reraise()
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     raise self.value
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     updated_port = self._update_port(
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     _ensure_no_port_binding_failure(port)
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]     raise exception.PortBindingFailed(port_id=port['id'])
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] nova.exception.PortBindingFailed: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information.
[ 634.129978] env[59534]: ERROR nova.compute.manager [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa]
[ 634.132051] env[59534]: DEBUG nova.compute.utils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 634.132586] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Build of instance 533ba77f-7191-4af2-b3c7-204efa001ffa was re-scheduled: Binding failed for port 88088e2d-f713-4fc1-93a7-61174147040f, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 634.133867] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 634.133867] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Acquiring lock "refresh_cache-533ba77f-7191-4af2-b3c7-204efa001ffa" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 634.133867] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Acquired lock "refresh_cache-533ba77f-7191-4af2-b3c7-204efa001ffa" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 634.133867] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 634.246020] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 634.566333] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 634.578705] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Releasing lock "refresh_cache-e105adff-476c-46b0-b795-daf44b69ef3a" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 634.578705] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 634.578705] env[59534]: DEBUG nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 634.578705] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 634.644741] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 634.652880] env[59534]: DEBUG nova.network.neutron [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 634.672949] env[59534]: INFO nova.compute.manager [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] [instance: e105adff-476c-46b0-b795-daf44b69ef3a] Took 0.10 seconds to deallocate network for instance.
[ 634.746702] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 634.761024] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Releasing lock "refresh_cache-533ba77f-7191-4af2-b3c7-204efa001ffa" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 634.761264] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 634.761521] env[59534]: DEBUG nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 634.761616] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 634.794982] env[59534]: INFO nova.scheduler.client.report [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Deleted allocations for instance e105adff-476c-46b0-b795-daf44b69ef3a
[ 634.808107] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 634.821120] env[59534]: DEBUG oslo_concurrency.lockutils [None req-dab7f34a-f06e-4470-834c-72aa23df9b69 tempest-ServerActionsTestOtherB-998771733 tempest-ServerActionsTestOtherB-998771733-project-member] Lock "e105adff-476c-46b0-b795-daf44b69ef3a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.162s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 634.835992] env[59534]: DEBUG nova.network.neutron [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 634.855163] env[59534]: INFO nova.compute.manager [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] [instance: 533ba77f-7191-4af2-b3c7-204efa001ffa] Took 0.09 seconds to deallocate network for instance.
[ 634.970349] env[59534]: INFO nova.scheduler.client.report [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Deleted allocations for instance 533ba77f-7191-4af2-b3c7-204efa001ffa
[ 635.006549] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e877e246-ef14-4679-8dff-87af11337d4d tempest-ServerDiagnosticsTest-268993674 tempest-ServerDiagnosticsTest-268993674-project-member] Lock "533ba77f-7191-4af2-b3c7-204efa001ffa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.504s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.055082] env[59534]: ERROR nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.055082] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 635.055082] env[59534]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 635.055082] env[59534]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 635.055082] env[59534]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 635.055082] env[59534]: ERROR nova.compute.manager     self.force_reraise()
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 635.055082] env[59534]: ERROR nova.compute.manager     raise self.value
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 635.055082] env[59534]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 635.055082] env[59534]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 635.055082] env[59534]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 635.055583] env[59534]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 635.055583] env[59534]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 635.055583] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.055583] env[59534]: ERROR nova.compute.manager
[ 635.055583] env[59534]: Traceback (most recent call last):
[ 635.055583] env[59534]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 635.055583] env[59534]:     listener.cb(fileno)
[ 635.055583] env[59534]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 635.055583] env[59534]:     result = function(*args, **kwargs)
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 635.055583] env[59534]:     return func(*args, **kwargs)
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 635.055583] env[59534]:     raise e
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 635.055583] env[59534]:     nwinfo = self.network_api.allocate_for_instance(
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 635.055583] env[59534]:     created_port_ids = self._update_ports_for_instance(
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 635.055583] env[59534]:     with excutils.save_and_reraise_exception():
[ 635.055583] env[59534]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 635.055583] env[59534]:     self.force_reraise()
[ 635.055583] env[59534]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 635.055583] env[59534]:     raise self.value
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 635.055583] env[59534]:     updated_port = self._update_port(
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 635.055583] env[59534]:     _ensure_no_port_binding_failure(port)
[ 635.055583] env[59534]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 635.055583] env[59534]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 635.056366] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.056366] env[59534]: Removing descriptor: 20
[ 635.056366] env[59534]: ERROR nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Traceback (most recent call last):
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     yield resources
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     self.driver.spawn(context, instance, image_meta,
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 635.056366] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     vm_ref = self.build_virtual_machine(instance,
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     for vif in network_info:
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     return self._sync_wrapper(fn, *args, **kwargs)
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     self.wait()
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     self[:] = self._gt.wait()
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     return self._exit_event.wait()
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 635.056690] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     result = hub.switch()
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     return self.greenlet.switch()
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     result = function(*args, **kwargs)
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     return func(*args, **kwargs)
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     raise e
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     nwinfo = self.network_api.allocate_for_instance(
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     created_port_ids = self._update_ports_for_instance(
[ 635.057097] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     with excutils.save_and_reraise_exception():
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     self.force_reraise()
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     raise self.value
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     updated_port = self._update_port(
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     _ensure_no_port_binding_failure(port)
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]     raise exception.PortBindingFailed(port_id=port['id'])
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.057551] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]
[ 635.057980] env[59534]: INFO nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Terminating instance
[ 635.062839] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquiring lock "refresh_cache-62cecc37-ce9f-42f6-8be2-efa724e94916" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 635.062839] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquired lock "refresh_cache-62cecc37-ce9f-42f6-8be2-efa724e94916" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 635.062839] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 635.129930] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 635.216851] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 635.231023] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Releasing lock "refresh_cache-62cecc37-ce9f-42f6-8be2-efa724e94916" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 635.231445] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Start destroying the instance on the hypervisor.
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 635.231608] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 635.232474] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f1ed21f2-ed0a-4b60-ba89-3c50530da18e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.245643] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e03b2f-baf0-4c3f-b904-d2f62a7938ac {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.277801] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 62cecc37-ce9f-42f6-8be2-efa724e94916 could not be found. 
[ 635.278070] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 635.278777] env[59534]: INFO nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Took 0.05 seconds to destroy the instance on the hypervisor. [ 635.278939] env[59534]: DEBUG oslo.service.loopingcall [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 635.280159] env[59534]: DEBUG nova.compute.manager [-] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 635.281109] env[59534]: DEBUG nova.network.neutron [-] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 635.309270] env[59534]: DEBUG nova.network.neutron [-] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.323531] env[59534]: DEBUG nova.network.neutron [-] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.335998] env[59534]: ERROR nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. [ 635.335998] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 635.335998] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 635.335998] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 635.335998] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 635.335998] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 635.335998] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 635.335998] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 635.335998] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 635.335998] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 635.335998] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 635.335998] env[59534]: ERROR nova.compute.manager raise self.value [ 
635.335998] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 635.335998] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 635.335998] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 635.335998] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 635.336620] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 635.336620] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 635.336620] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. [ 635.336620] env[59534]: ERROR nova.compute.manager [ 635.336620] env[59534]: Traceback (most recent call last): [ 635.336620] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 635.336620] env[59534]: listener.cb(fileno) [ 635.336620] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 635.336620] env[59534]: result = function(*args, **kwargs) [ 635.336620] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 635.336620] env[59534]: return func(*args, **kwargs) [ 635.336620] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 635.336620] env[59534]: raise e [ 635.336620] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 635.336620] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 635.336620] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 635.336620] env[59534]: created_port_ids = 
self._update_ports_for_instance( [ 635.336620] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 635.336620] env[59534]: with excutils.save_and_reraise_exception(): [ 635.336620] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 635.336620] env[59534]: self.force_reraise() [ 635.336620] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 635.336620] env[59534]: raise self.value [ 635.336620] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 635.336620] env[59534]: updated_port = self._update_port( [ 635.336620] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 635.336620] env[59534]: _ensure_no_port_binding_failure(port) [ 635.336620] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 635.336620] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 635.337432] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. [ 635.337432] env[59534]: Removing descriptor: 21 [ 635.337432] env[59534]: ERROR nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. 
[ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Traceback (most recent call last): [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] yield resources [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self.driver.spawn(context, instance, image_meta, [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 635.337432] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] vm_ref = self.build_virtual_machine(instance, [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] vif_infos = vmwarevif.get_vif_info(self._session, [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 635.338313] env[59534]: ERROR 
nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] for vif in network_info: [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return self._sync_wrapper(fn, *args, **kwargs) [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self.wait() [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self[:] = self._gt.wait() [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return self._exit_event.wait() [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 635.338313] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] result = hub.switch() [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return self.greenlet.switch() [ 635.339533] env[59534]: ERROR 
nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] result = function(*args, **kwargs) [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return func(*args, **kwargs) [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] raise e [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] nwinfo = self.network_api.allocate_for_instance( [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] created_port_ids = self._update_ports_for_instance( [ 635.339533] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] with excutils.save_and_reraise_exception(): [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 
89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self.force_reraise() [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] raise self.value [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] updated_port = self._update_port( [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] _ensure_no_port_binding_failure(port) [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] raise exception.PortBindingFailed(port_id=port['id']) [ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. 
[ 635.341096] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] [ 635.341771] env[59534]: INFO nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Terminating instance [ 635.341771] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquiring lock "refresh_cache-89c6b8db-b87a-4a05-9fab-72eff91e4fe3" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.341771] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquired lock "refresh_cache-89c6b8db-b87a-4a05-9fab-72eff91e4fe3" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.341771] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 635.341771] env[59534]: INFO nova.compute.manager [-] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Took 0.06 seconds to deallocate network for instance. 
[ 635.343964] env[59534]: DEBUG nova.compute.claims [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 635.344753] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.345013] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.386798] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.430899] env[59534]: ERROR nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. 
[ 635.430899] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 635.430899] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 635.430899] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 635.430899] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 635.430899] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 635.430899] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 635.430899] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 635.430899] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 635.430899] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 635.430899] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 635.430899] env[59534]: ERROR nova.compute.manager raise self.value [ 635.430899] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 635.430899] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 635.430899] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 635.430899] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 635.431406] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 635.431406] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 635.431406] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. [ 635.431406] env[59534]: ERROR nova.compute.manager [ 635.431406] env[59534]: Traceback (most recent call last): [ 635.431406] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 635.431406] env[59534]: listener.cb(fileno) [ 635.431406] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 635.431406] env[59534]: result = function(*args, **kwargs) [ 635.431406] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 635.431406] env[59534]: return func(*args, **kwargs) [ 635.431406] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 635.431406] env[59534]: raise e [ 635.431406] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 635.431406] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 635.431406] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 635.431406] env[59534]: created_port_ids = self._update_ports_for_instance( [ 635.431406] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 635.431406] env[59534]: with excutils.save_and_reraise_exception(): [ 635.431406] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 635.431406] env[59534]: self.force_reraise() [ 635.431406] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 635.431406] env[59534]: raise self.value [ 635.431406] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 635.431406] env[59534]: updated_port = self._update_port( [ 635.431406] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 635.431406] env[59534]: _ensure_no_port_binding_failure(port) [ 635.431406] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 635.431406] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 635.432210] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. [ 635.432210] env[59534]: Removing descriptor: 12 [ 635.432210] env[59534]: ERROR nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. [ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] Traceback (most recent call last): [ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] yield resources [ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self.driver.spawn(context, instance, image_meta, [ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 635.432210] env[59534]: ERROR nova.compute.manager 
[instance: 58622c1f-054c-454b-a288-f544fe883157] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 635.432210] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] vm_ref = self.build_virtual_machine(instance,
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] vif_infos = vmwarevif.get_vif_info(self._session,
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] for vif in network_info:
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return self._sync_wrapper(fn, *args, **kwargs)
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self.wait()
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self[:] = self._gt.wait()
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return self._exit_event.wait()
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 635.432586] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] result = hub.switch()
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return self.greenlet.switch()
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] result = function(*args, **kwargs)
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return func(*args, **kwargs)
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] raise e
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] nwinfo = self.network_api.allocate_for_instance(
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] created_port_ids = self._update_ports_for_instance(
[ 635.432974] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] with excutils.save_and_reraise_exception():
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self.force_reraise()
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] raise self.value
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] updated_port = self._update_port(
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] _ensure_no_port_binding_failure(port)
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] raise exception.PortBindingFailed(port_id=port['id'])
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information.
[ 635.435278] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157]
[ 635.435707] env[59534]: INFO nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Terminating instance
[ 635.438825] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquiring lock "refresh_cache-58622c1f-054c-454b-a288-f544fe883157" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 635.438977] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquired lock "refresh_cache-58622c1f-054c-454b-a288-f544fe883157" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 635.439151] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 635.488358] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f471bc66-2a95-47fb-8ac0-9492f1847e20 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.509666] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13bd0835-d359-453a-8f0d-bc8489d54ce6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.514926] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 635.562205] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c83617a-8696-49ad-a9e5-72670dbbb7fd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.573590] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98e3441c-1f3b-4f89-9751-c523a4b61c50 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.594870] env[59534]: DEBUG nova.compute.provider_tree [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 635.600072] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 635.604606] env[59534]: DEBUG nova.scheduler.client.report [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 635.615769] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Releasing lock "refresh_cache-89c6b8db-b87a-4a05-9fab-72eff91e4fe3" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 635.616506] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 635.616757] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 635.617248] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-733f771c-3952-46b7-9f25-cf001115b998 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.624819] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.624819] env[59534]: ERROR nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Traceback (most recent call last):
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] self.driver.spawn(context, instance, image_meta,
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 635.624819] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] vm_ref = self.build_virtual_machine(instance,
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] vif_infos = vmwarevif.get_vif_info(self._session,
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] for vif in network_info:
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] return self._sync_wrapper(fn, *args, **kwargs)
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] self.wait()
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] self[:] = self._gt.wait()
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] return self._exit_event.wait()
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 635.625186] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] result = hub.switch()
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] return self.greenlet.switch()
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] result = function(*args, **kwargs)
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] return func(*args, **kwargs)
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] raise e
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] nwinfo = self.network_api.allocate_for_instance(
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] created_port_ids = self._update_ports_for_instance(
[ 635.625527] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] with excutils.save_and_reraise_exception():
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] self.force_reraise()
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] raise self.value
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] updated_port = self._update_port(
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] _ensure_no_port_binding_failure(port)
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] raise exception.PortBindingFailed(port_id=port['id'])
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] nova.exception.PortBindingFailed: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information.
[ 635.625908] env[59534]: ERROR nova.compute.manager [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916]
[ 635.626250] env[59534]: DEBUG nova.compute.utils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 635.629260] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Build of instance 62cecc37-ce9f-42f6-8be2-efa724e94916 was re-scheduled: Binding failed for port 9dbb0ab7-5a0a-4878-850b-2d43c31d668f, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 635.629260] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 635.629260] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquiring lock "refresh_cache-62cecc37-ce9f-42f6-8be2-efa724e94916" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 635.629260] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Acquired lock "refresh_cache-62cecc37-ce9f-42f6-8be2-efa724e94916" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 635.629482] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 635.632843] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41d39dc1-0cd4-4709-a845-1da39ce2afcb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.658715] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 89c6b8db-b87a-4a05-9fab-72eff91e4fe3 could not be found.
[ 635.658949] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 635.659135] env[59534]: INFO nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 635.659370] env[59534]: DEBUG oslo.service.loopingcall [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 635.659567] env[59534]: DEBUG nova.compute.manager [-] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 635.659663] env[59534]: DEBUG nova.network.neutron [-] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 635.669921] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 635.708316] env[59534]: DEBUG nova.network.neutron [-] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 635.716524] env[59534]: DEBUG nova.network.neutron [-] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 635.730018] env[59534]: INFO nova.compute.manager [-] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Took 0.07 seconds to deallocate network for instance.
[ 635.731948] env[59534]: DEBUG nova.compute.claims [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 635.732146] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.732644] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.841364] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ed59223-9ba6-4cad-ae49-f5f5722e0fff {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.849677] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64b1859a-0f67-49d0-b69d-f90580a2958f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.882897] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75703d06-990a-41e0-bb6e-6704bb52946c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.891426] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b26460d-0072-49b4-bf96-7c7ca0c38627 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 635.906235] env[59534]: DEBUG nova.compute.provider_tree [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 635.917644] env[59534]: DEBUG nova.scheduler.client.report [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 635.936354] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.204s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.937164] env[59534]: ERROR nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information.
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Traceback (most recent call last):
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self.driver.spawn(context, instance, image_meta,
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] vm_ref = self.build_virtual_machine(instance,
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] vif_infos = vmwarevif.get_vif_info(self._session,
[ 635.937164] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] for vif in network_info:
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return self._sync_wrapper(fn, *args, **kwargs)
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self.wait()
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self[:] = self._gt.wait()
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return self._exit_event.wait()
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] result = hub.switch()
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return self.greenlet.switch()
[ 635.937697] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] result = function(*args, **kwargs)
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] return func(*args, **kwargs)
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] raise e
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] nwinfo = self.network_api.allocate_for_instance(
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] created_port_ids = self._update_ports_for_instance(
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] with excutils.save_and_reraise_exception():
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 635.938236] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] self.force_reraise()
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] raise self.value
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] updated_port = self._update_port(
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] _ensure_no_port_binding_failure(port)
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] raise exception.PortBindingFailed(port_id=port['id'])
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] nova.exception.PortBindingFailed: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information.
[ 635.939152] env[59534]: ERROR nova.compute.manager [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] [ 635.939758] env[59534]: DEBUG nova.compute.utils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 635.939758] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Build of instance 89c6b8db-b87a-4a05-9fab-72eff91e4fe3 was re-scheduled: Binding failed for port 9290e007-9d01-4b34-a972-ef5cbd7ff2c7, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 635.939853] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 635.940051] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquiring lock "refresh_cache-89c6b8db-b87a-4a05-9fab-72eff91e4fe3" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.940183] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd 
tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Acquired lock "refresh_cache-89c6b8db-b87a-4a05-9fab-72eff91e4fe3" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.940492] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 635.947756] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.958338] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Releasing lock "refresh_cache-62cecc37-ce9f-42f6-8be2-efa724e94916" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.958572] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 635.958730] env[59534]: DEBUG nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 635.958890] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 636.019056] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.023499] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.034948] env[59534]: DEBUG nova.network.neutron [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.043973] env[59534]: INFO nova.compute.manager [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] [instance: 62cecc37-ce9f-42f6-8be2-efa724e94916] Took 0.08 seconds to deallocate network for instance. [ 636.047107] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.057747] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Releasing lock "refresh_cache-58622c1f-054c-454b-a288-f544fe883157" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.058250] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 636.058348] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 636.058834] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3976d80a-0a8a-42c9-b120-0a96e7d07e9c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.072289] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e24b604b-a318-4d9e-9104-91581c4b3d7d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.107949] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 58622c1f-054c-454b-a288-f544fe883157 could not be found. [ 636.108182] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 636.108384] env[59534]: INFO nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 636.108582] env[59534]: DEBUG oslo.service.loopingcall [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 636.108791] env[59534]: DEBUG nova.compute.manager [-] [instance: 58622c1f-054c-454b-a288-f544fe883157] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 636.108884] env[59534]: DEBUG nova.network.neutron [-] [instance: 58622c1f-054c-454b-a288-f544fe883157] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 636.149431] env[59534]: INFO nova.scheduler.client.report [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Deleted allocations for instance 62cecc37-ce9f-42f6-8be2-efa724e94916 [ 636.168059] env[59534]: DEBUG nova.network.neutron [-] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.173017] env[59534]: DEBUG oslo_concurrency.lockutils [None req-62ee8e35-60b3-423b-be73-b8bea77860df tempest-ServersAdminNegativeTestJSON-259660287 tempest-ServersAdminNegativeTestJSON-259660287-project-member] Lock "62cecc37-ce9f-42f6-8be2-efa724e94916" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.937s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.182677] env[59534]: DEBUG nova.network.neutron [-] [instance: 58622c1f-054c-454b-a288-f544fe883157] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.191229] env[59534]: INFO nova.compute.manager [-] [instance: 58622c1f-054c-454b-a288-f544fe883157] Took 0.08 seconds to deallocate network for instance. [ 636.196142] env[59534]: DEBUG nova.compute.claims [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 636.196253] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.196401] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.293743] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3a88ef9-2564-4c7d-ad76-1992e332d888 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.302168] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f13f46d-d842-48cc-89ab-ca8cf41f9b21 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.333117] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.334661] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38353c41-51b7-4e21-9860-2c41c29f6591 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.343491] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c958df3-d5d5-41c0-bfbc-95b482bd54c9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.349794] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Releasing lock "refresh_cache-89c6b8db-b87a-4a05-9fab-72eff91e4fe3" {{(pid=59534) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.350027] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 636.350214] env[59534]: DEBUG nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 636.350367] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 636.360926] env[59534]: DEBUG nova.compute.provider_tree [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 636.369651] env[59534]: DEBUG nova.scheduler.client.report [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 636.387518] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.191s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.388094] env[59534]: ERROR nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. 
[ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] Traceback (most recent call last): [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self.driver.spawn(context, instance, image_meta, [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self._vmops.spawn(context, instance, image_meta, injected_files, [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] vm_ref = self.build_virtual_machine(instance, [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] vif_infos = vmwarevif.get_vif_info(self._session, [ 636.388094] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] for vif in network_info: [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 636.388741] env[59534]: ERROR 
nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return self._sync_wrapper(fn, *args, **kwargs) [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self.wait() [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self[:] = self._gt.wait() [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return self._exit_event.wait() [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] result = hub.switch() [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return self.greenlet.switch() [ 636.388741] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] result = function(*args, **kwargs) [ 
636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] return func(*args, **kwargs) [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] raise e [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] nwinfo = self.network_api.allocate_for_instance( [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] created_port_ids = self._update_ports_for_instance( [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] with excutils.save_and_reraise_exception(): [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 636.389351] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] self.force_reraise() [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 
58622c1f-054c-454b-a288-f544fe883157] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] raise self.value [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] updated_port = self._update_port( [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] _ensure_no_port_binding_failure(port) [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] raise exception.PortBindingFailed(port_id=port['id']) [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] nova.exception.PortBindingFailed: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. [ 636.389878] env[59534]: ERROR nova.compute.manager [instance: 58622c1f-054c-454b-a288-f544fe883157] [ 636.389878] env[59534]: DEBUG nova.compute.utils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 636.390686] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Build of instance 58622c1f-054c-454b-a288-f544fe883157 was re-scheduled: Binding failed for port 87fcf100-d495-4ce5-a821-bc64c3aabd53, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 636.391118] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 636.391338] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquiring lock "refresh_cache-58622c1f-054c-454b-a288-f544fe883157" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.391520] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Acquired lock "refresh_cache-58622c1f-054c-454b-a288-f544fe883157" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.391624] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Building network info cache 
for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 636.439551] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.480133] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.488205] env[59534]: DEBUG nova.network.neutron [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.502092] env[59534]: INFO nova.compute.manager [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] [instance: 89c6b8db-b87a-4a05-9fab-72eff91e4fe3] Took 0.15 seconds to deallocate network for instance. 
[ 636.593381] env[59534]: INFO nova.scheduler.client.report [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Deleted allocations for instance 89c6b8db-b87a-4a05-9fab-72eff91e4fe3
[ 636.613823] env[59534]: DEBUG oslo_concurrency.lockutils [None req-bb040803-b257-4d0a-b376-2fbcaca148fd tempest-ServerRescueNegativeTestJSON-1101508799 tempest-ServerRescueNegativeTestJSON-1101508799-project-member] Lock "89c6b8db-b87a-4a05-9fab-72eff91e4fe3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.735s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 636.771297] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 636.800146] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Releasing lock "refresh_cache-58622c1f-054c-454b-a288-f544fe883157" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 636.800456] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 636.800646] env[59534]: DEBUG nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 636.800806] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 636.852043] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 636.862988] env[59534]: DEBUG nova.network.neutron [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 636.878402] env[59534]: INFO nova.compute.manager [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] [instance: 58622c1f-054c-454b-a288-f544fe883157] Took 0.08 seconds to deallocate network for instance.
[ 637.004478] env[59534]: INFO nova.scheduler.client.report [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Deleted allocations for instance 58622c1f-054c-454b-a288-f544fe883157
[ 637.033557] env[59534]: DEBUG oslo_concurrency.lockutils [None req-cff2793b-c043-4043-ab00-b6207da6a7b2 tempest-AttachInterfacesTestJSON-852401293 tempest-AttachInterfacesTestJSON-852401293-project-member] Lock "58622c1f-054c-454b-a288-f544fe883157" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.475s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 679.488115] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.508375] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.686934] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.687142] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Starting heal instance info cache {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 679.687258] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Rebuilding the list of instances to heal {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 679.698609] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Didn't find any instances for network info cache update. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 679.698817] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.698969] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.699133] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.699538] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 679.699681] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59534) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 680.687731] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 680.687731] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 680.688462] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 680.703817] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 680.704492] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 680.705262] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 680.705941] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59534) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 680.706876] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc6eb26d-ef30-4a12-962a-afd2565d8004 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.718815] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ca3b552-4b0e-4d23-bb98-fdb51c2bdb7a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.736981] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-378714f2-5896-41de-8aee-f635779cc8f9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.746354] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ef3262b-8406-40cd-9a21-24723d0d2a36 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.781202] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181496MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59534) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 680.781357] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 680.781549] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 680.840373] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 680.843255] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 680.861132] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e1a8f2-6144-4f0c-8825-1b0a8a4b682c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.877116] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c39b233f-b079-4fbf-8469-acce656fdda8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.923515] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ebee948-3c04-4127-aa04-8e502642356e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.934324] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a19d9ecf-8e64-43de-a02b-6897658f357f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 680.950858] env[59534]: DEBUG nova.compute.provider_tree [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 680.966842] env[59534]: DEBUG nova.scheduler.client.report [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 680.987107] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59534) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 680.987304] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 689.132180] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "053d549e-b3d6-4498-9261-cfacaf8b43bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 689.132476] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Lock "053d549e-b3d6-4498-9261-cfacaf8b43bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 689.143602] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 689.208170] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 689.208170] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 689.209285] env[59534]: INFO nova.compute.claims [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 689.292361] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0c021a0-6bfc-4e69-b8b9-969e18ed22dc {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.302934] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff421491-3ed4-414c-a1af-d30f2d45368a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.340650] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d144ba-f88f-4d06-a318-c726ab21662c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.349844] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85fe47f4-8a41-4c46-bc08-11584da73e85 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.365799] env[59534]: DEBUG nova.compute.provider_tree [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 689.379587] env[59534]: DEBUG nova.scheduler.client.report [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 689.396920] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 689.397301] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 689.441271] env[59534]: DEBUG nova.compute.utils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 689.441903] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Not allocating networking since 'none' was specified. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 689.455883] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 689.530073] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 689.557402] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 689.557814] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 689.557814] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 689.557958] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 689.558260] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 689.558260] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 689.558434] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 689.558706] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 689.558784] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 689.559254] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 689.559254] env[59534]: DEBUG nova.virt.hardware [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 689.560031] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a2eb80f-9202-4500-ae6a-34644dbc6b44 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.568756] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e76ea11-86c1-424f-b0a0-efd47f5b4903 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.584809] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Instance VIF info [] {{(pid=59534) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 689.593909] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59534) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 689.594330] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1334ab6d-e3e4-45ce-8cdb-142037b4cba1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.607626] env[59534]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error.
[ 689.607824] env[59534]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59534) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}}
[ 689.608157] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Folder already exists: OpenStack. Parent ref: group-v4. {{(pid=59534) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}}
[ 689.608342] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Creating folder: Project (1d356250256a467bbaf8c426158abae7). Parent ref: group-v280247. {{(pid=59534) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 689.608568] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c113899c-62af-4273-ae14-2e0b499f863b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.617848] env[59534]: INFO nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Created folder: Project (1d356250256a467bbaf8c426158abae7) in parent group-v280247.
[ 689.618049] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Creating folder: Instances. Parent ref: group-v280252. {{(pid=59534) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 689.618273] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b60b005b-5740-4db1-a2c8-9f2b3b536c75 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.627214] env[59534]: INFO nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Created folder: Instances in parent group-v280252.
[ 689.627214] env[59534]: DEBUG oslo.service.loopingcall [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 689.627214] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Creating VM on the ESX host {{(pid=59534) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 689.627379] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bc5366a6-7d52-4adb-9dfb-9e2a2b4ca46d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 689.644027] env[59534]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 689.644027] env[59534]: value = "task-1308568"
[ 689.644027] env[59534]: _type = "Task"
[ 689.644027] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 689.659531] env[59534]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308568, 'name': CreateVM_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 690.155396] env[59534]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308568, 'name': CreateVM_Task, 'duration_secs': 0.257515} completed successfully. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 690.155656] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Created VM on the ESX host {{(pid=59534) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 690.157142] env[59534]: DEBUG oslo_vmware.service [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-877cb4d0-5ebe-4b32-9db3-c6c3c65fa3f4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.164540] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 690.164744] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquired lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 690.165508] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 690.166729] env[59534]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6fe3143-f2dc-48d8-a948-c9647162efb3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.171189] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Waiting for the task: (returnval){
[ 690.171189] env[59534]: value = "session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52d6bbdd-64c5-3628-2671-c5e5d126bc25"
[ 690.171189] env[59534]: _type = "Task"
[ 690.171189] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 690.178706] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Task: {'id': session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52d6bbdd-64c5-3628-2671-c5e5d126bc25, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 690.688954] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Releasing lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 690.688954] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Processing image ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 690.688954] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.688954] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquired lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 690.689173] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Creating 
directory with path [datastore1] devstack-image-cache_base {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 690.689173] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a9d7064-8416-4f26-b1da-e6e174eeb934 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.699913] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 690.700994] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59534) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 690.703119] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb6d99c-caee-4a59-94c7-2f8cf897c232 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.710644] env[59534]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a12edbcf-66fe-4690-a72b-9799c3521821 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.716202] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Waiting for the task: (returnval){ [ 690.716202] env[59534]: value = "session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52388551-784a-1271-7a19-90deee013d1e" [ 690.716202] env[59534]: 
_type = "Task" [ 690.716202] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 690.726694] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Task: {'id': session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52388551-784a-1271-7a19-90deee013d1e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 691.230964] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Preparing fetch location {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 691.230964] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Creating directory with path [datastore1] vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 691.230964] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4b5c83c9-b652-4adf-a868-531a7cdc849b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.403367] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "7552a136-0a44-43a0-909a-3495eddaa4c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.404545] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "7552a136-0a44-43a0-909a-3495eddaa4c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.420826] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 691.488247] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.488494] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.489944] env[59534]: INFO nova.compute.claims [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 691.608891] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def00c8f-8495-4bef-856d-ca6aa65a8e0b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.624597] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-332dddbd-7866-4506-92b1-cf629741580c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.657348] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e727ee51-fac7-4a59-a859-838e22167055 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.665346] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27a70693-7c77-4874-8817-a12af0d74ec5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.679590] env[59534]: DEBUG nova.compute.provider_tree [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 691.691280] env[59534]: DEBUG nova.scheduler.client.report [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 691.706480] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.706943] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 691.716677] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Created directory with path [datastore1] vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 691.716851] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Fetch image to [datastore1] vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 691.717013] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None 
req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Downloading image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to [datastore1] vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk on the data store datastore1 {{(pid=59534) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 691.717795] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f485156-e507-4cb7-9ad3-3b2257f7ec6c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.725717] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c8e0c49-3da9-4ebb-ae9c-537dcf34cdcd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.735193] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba2c9497-83f3-48b5-b00a-585df76303cd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.740651] env[59534]: DEBUG nova.compute.utils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 691.742255] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 691.742327] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 691.774378] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 691.777361] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07ccd4fb-a903-446c-bcdd-e2a126fd265c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.784256] env[59534]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-97fb4732-e289-401e-8f8a-2d0780d99e73 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.819540] env[59534]: DEBUG nova.virt.vmwareapi.images [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Downloading image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to the data store datastore1 {{(pid=59534) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 691.844918] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 
7552a136-0a44-43a0-909a-3495eddaa4c9] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 691.876228] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 691.876462] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 691.876615] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 691.876789] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 
tempest-DeleteServersTestJSON-824782706-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 691.876931] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 691.877185] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 691.877421] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 691.877824] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 691.878019] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 691.878187] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 
tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 691.878355] env[59534]: DEBUG nova.virt.hardware [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 691.879221] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b877757-7ab4-4867-a26e-9307b8a4bbff {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.883414] env[59534]: DEBUG oslo_vmware.rw_handles [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59534) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 691.942963] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74603260-9373-473a-815f-7e9fd16b029d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.951152] env[59534]: DEBUG oslo_vmware.rw_handles [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Completed reading data from the image iterator. 
{{(pid=59534) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 691.951262] env[59534]: DEBUG oslo_vmware.rw_handles [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59534) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 692.519172] env[59534]: DEBUG nova.policy [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1de811c49e4475e8bebd9420cb053e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f07a595f0d54471b9b09e9b1b9b0b5a5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 694.804718] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "7552a136-0a44-43a0-909a-3495eddaa4c9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.347869] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 
7552a136-0a44-43a0-909a-3495eddaa4c9] Successfully created port: 727c2892-1c96-4183-b45a-4602d53bd9be {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 697.335224] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "fcdac100-e2fe-434e-87ca-0a174ecfb0e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.335224] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "fcdac100-e2fe-434e-87ca-0a174ecfb0e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.347627] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 697.398562] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "86b1de9b-bfc7-4810-9906-e59e01f11594" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.398562] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "86b1de9b-bfc7-4810-9906-e59e01f11594" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.411376] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 697.424101] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.424101] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.424101] env[59534]: INFO nova.compute.claims [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 697.475685] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.564488] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-881d2790-d87e-4dc9-9cd0-992d44799a6f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.572732] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7060a9bf-9538-4460-a699-c74449b21189 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.605140] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f26562fc-1bb9-43ac-a46d-a0649f4e34c7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.612697] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73334a9b-cf40-4259-ba24-788513268bde {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.628236] env[59534]: DEBUG nova.compute.provider_tree [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.638520] env[59534]: DEBUG nova.scheduler.client.report [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.652259] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 
tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.652737] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 697.655144] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.179s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.656606] env[59534]: INFO nova.compute.claims [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 697.690525] env[59534]: DEBUG nova.compute.utils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.693854] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 697.694107] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 697.706599] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 697.799889] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7175cc77-8999-4550-8b68-5625a21cdac4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.808670] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54bdde36-7ba0-4931-99d1-8faa231d1244 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.844031] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 697.846774] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc013313-235c-4dc5-a038-cc3eea456b5b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.854995] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-312ccdf6-0fe7-4e7d-a6c9-079a043dbd03 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.870745] env[59534]: DEBUG nova.compute.provider_tree [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.881636] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 697.881949] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 697.882836] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 697.882836] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 697.882836] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 697.882836] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 697.883055] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 697.883417] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 697.883417] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 697.883528] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 697.883727] env[59534]: DEBUG nova.virt.hardware [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 697.884905] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be218b2f-745e-44ae-a339-ab238a92a527 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.889483] env[59534]: DEBUG nova.scheduler.client.report [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 
1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.899144] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6316f0f-3c1a-48b3-9b56-6f96d663be8e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.905447] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.906194] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 697.948457] env[59534]: DEBUG nova.compute.utils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.948901] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 697.949098] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 697.959015] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 698.033733] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 698.060402] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 698.060638] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 698.060791] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 698.060975] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 698.061161] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 698.061326] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 698.061532] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 698.061685] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 698.061985] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 698.061985] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 698.062552] env[59534]: DEBUG nova.virt.hardware [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 698.063403] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-712aa836-c7de-40aa-9b60-55c2b60a7677 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.075944] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-929c9934-2f72-447c-99e3-cfc666a487fb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.127481] env[59534]: DEBUG nova.policy [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55181ba3de204d70a5bd1fc4e92ad17c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61fe8d56a7ce444e87d2e7743b3a961f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 698.408312] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "11ff4621-5da2-4da3-97bf-2ff6b231233c" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.408662] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "11ff4621-5da2-4da3-97bf-2ff6b231233c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.420784] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 698.480088] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.480338] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.482270] env[59534]: INFO nova.compute.claims [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 
tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 698.645852] env[59534]: DEBUG nova.policy [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5c59c008b2e44389ed9e37eb461a0c5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c0b17586dcd47b9b08c331a00083433', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 698.650313] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-730e3cf9-f139-4d30-b7eb-acbe4b486262 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.658319] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb89df8-5943-41e5-9bdd-c65988a4340d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.702254] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9411caa2-b8ec-409a-a9d1-5acdd2561f37 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.711737] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8793ff15-5780-4f6c-a85c-ca3f7583a4b5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.726903] env[59534]: DEBUG 
nova.compute.provider_tree [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 698.738594] env[59534]: DEBUG nova.scheduler.client.report [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 698.752608] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.753191] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 698.793489] env[59534]: DEBUG nova.compute.utils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 698.796609] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 698.796609] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 698.811333] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 698.901101] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 698.933544] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 698.933782] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 698.933942] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 698.936849] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Flavor pref 
0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 698.936849] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 698.936849] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 698.936849] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 698.936849] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 698.937172] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 698.937172] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 
tempest-ListImageFiltersTestJSON-769684321-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 698.937239] env[59534]: DEBUG nova.virt.hardware [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 698.939429] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5657cd45-c8e4-4e0e-bca2-674744a11037 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.949868] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e597dc55-0e87-4ec1-9faa-1363a35fe285 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.718473] env[59534]: DEBUG nova.policy [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cee73fe533ad439ab17e537c4bfc7a20', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21f218c3b73046bda1ac458019183605', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 701.242662] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 
tempest-ServerActionsTestJSON-634471831-project-member] Acquiring lock "68f534dd-7119-48f4-85ae-14bfdf68d486" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.242662] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Lock "68f534dd-7119-48f4-85ae-14bfdf68d486" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.256785] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 701.316365] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.316761] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.319312] env[59534]: INFO nova.compute.claims [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 701.480012] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5896ead9-66a0-4196-9b88-b12f785973db {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.489712] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-424aa705-b6dc-48a9-87c6-197ed7c7b9c7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.530177] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdd84c6c-b0ed-4b11-bc83-5008171c868c {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.537973] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7584c9e2-9246-4ca5-9ef3-7e6b30f0e3d4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.553064] env[59534]: DEBUG nova.compute.provider_tree [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 701.568604] env[59534]: DEBUG nova.scheduler.client.report [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 701.595341] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.595836] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c 
tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 701.634432] env[59534]: DEBUG nova.compute.utils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 701.636208] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 701.637471] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 701.655083] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 701.726469] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 701.751228] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 701.752376] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 701.752907] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 701.755398] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Flavor pref 0:0:0 {{(pid=59534) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 701.755398] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 701.755398] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 701.755398] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 701.755398] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 701.755765] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 701.755765] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 701.755765] env[59534]: DEBUG nova.virt.hardware [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 701.755765] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9f905d-7d20-4863-8d16-a65ced458408 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.765901] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79d1d8c1-6525-49c9-84ef-388c4b0f9074 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.821625] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Successfully created port: 938a7aad-7df9-4af1-bd70-4d4ab4f7363b {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 701.831119] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "1b3f51ff-2374-499e-8d60-6a0e5cdd2609" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.831259] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 
tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "1b3f51ff-2374-499e-8d60-6a0e5cdd2609" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.846523] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 701.909554] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.909794] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.911268] env[59534]: INFO nova.compute.claims [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 702.092206] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-747538a0-ddbf-4cd1-96fa-547d7db79ac1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.100353] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0209dc7-0284-4817-9551-a4a9fed5d092 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.135111] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4dfb2ae-bdfa-4945-a440-4dc272fa63c7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.142098] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68864340-be1c-4d72-b2e1-33e987439654 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.155659] env[59534]: DEBUG nova.compute.provider_tree [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.171020] env[59534]: DEBUG nova.scheduler.client.report [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.190340] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.190340] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 702.235578] env[59534]: DEBUG nova.compute.utils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 702.236861] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 702.237069] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 702.253400] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 702.329931] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 702.360096] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 702.360698] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 702.361074] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 702.361150] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Flavor pref 
0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 702.361459] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 702.361522] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 702.362321] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 702.362804] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 702.362804] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 702.362973] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 
tempest-ListImageFiltersTestJSON-769684321-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 702.363194] env[59534]: DEBUG nova.virt.hardware [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 702.364152] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caea8131-07c4-4679-a164-45499b3172f6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.374375] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-599345d8-279d-4e0c-ac20-9e7e0c2f8630 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.403256] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Successfully created port: e33ba32f-c6a6-4766-af21-edfc90a29ea1 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.440455] env[59534]: DEBUG nova.policy [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fa3e1fb90cb24248a19a162d2d3bc7a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82d442a1227e4368a2f74c32503387d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 
'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 702.923472] env[59534]: DEBUG nova.policy [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cee73fe533ad439ab17e537c4bfc7a20', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21f218c3b73046bda1ac458019183605', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 703.971723] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Successfully created port: 8b6deb85-5996-4f3a-8207-fca9bf3c99e3 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 705.557565] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Successfully created port: bd89f1d4-a239-4ce1-91f2-766e8f9cf459 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 705.560532] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 
fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Successfully created port: b0b18d8c-1198-48bf-9ab0-3de8ad8c08dc {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 706.458499] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Successfully created port: 69c955c3-52ad-4fe7-a327-c43a02064bd0 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 707.739701] env[59534]: ERROR nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information. [ 707.739701] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 707.739701] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 707.739701] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 707.739701] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 707.739701] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 707.739701] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 707.739701] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 707.739701] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.739701] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 
707.739701] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.739701] env[59534]: ERROR nova.compute.manager raise self.value [ 707.739701] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 707.739701] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 707.739701] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.739701] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 707.740280] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.740280] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 707.740280] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information. 
[ 707.740280] env[59534]: ERROR nova.compute.manager
[ 707.740280] env[59534]: Traceback (most recent call last):
[ 707.740280] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 707.740280] env[59534]: listener.cb(fileno)
[ 707.740280] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 707.740280] env[59534]: result = function(*args, **kwargs)
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 707.740280] env[59534]: return func(*args, **kwargs)
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 707.740280] env[59534]: raise e
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 707.740280] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 707.740280] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 707.740280] env[59534]: with excutils.save_and_reraise_exception():
[ 707.740280] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 707.740280] env[59534]: self.force_reraise()
[ 707.740280] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 707.740280] env[59534]: raise self.value
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 707.740280] env[59534]: updated_port = self._update_port(
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 707.740280] env[59534]: _ensure_no_port_binding_failure(port)
[ 707.740280] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 707.740280] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 707.741145] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information.
[ 707.741145] env[59534]: Removing descriptor: 12
[ 707.741145] env[59534]: ERROR nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information.
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Traceback (most recent call last):
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] yield resources
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self.driver.spawn(context, instance, image_meta,
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 707.741145] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] vm_ref = self.build_virtual_machine(instance,
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] vif_infos = vmwarevif.get_vif_info(self._session,
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] for vif in network_info:
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return self._sync_wrapper(fn, *args, **kwargs)
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self.wait()
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self[:] = self._gt.wait()
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return self._exit_event.wait()
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 707.741499] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] result = hub.switch()
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return self.greenlet.switch()
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] result = function(*args, **kwargs)
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return func(*args, **kwargs)
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] raise e
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] nwinfo = self.network_api.allocate_for_instance(
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] created_port_ids = self._update_ports_for_instance(
[ 707.741867] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] with excutils.save_and_reraise_exception():
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self.force_reraise()
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] raise self.value
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] updated_port = self._update_port(
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] _ensure_no_port_binding_failure(port)
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] raise exception.PortBindingFailed(port_id=port['id'])
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information.
[ 707.742260] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9]
[ 707.742674] env[59534]: INFO nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Terminating instance
[ 707.744866] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 707.745274] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 707.745488] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 707.810292] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 708.652992] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 708.670761] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 708.671268] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 708.671458] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 708.672036] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e8c38db5-3b00-44f2-8423-e2aea27e3eb7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 708.684290] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac712ae-a52e-4a3b-b519-c75f75af495b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 708.720758] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7552a136-0a44-43a0-909a-3495eddaa4c9 could not be found.
[ 708.721082] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 708.721284] env[59534]: INFO nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 708.721539] env[59534]: DEBUG oslo.service.loopingcall [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 708.722480] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Successfully created port: f66650b8-a02e-47f8-83b2-52714be6c463 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 708.724862] env[59534]: DEBUG nova.compute.manager [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 708.724961] env[59534]: DEBUG nova.network.neutron [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 708.823144] env[59534]: DEBUG nova.network.neutron [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 708.844697] env[59534]: DEBUG nova.network.neutron [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 708.852731] env[59534]: DEBUG nova.compute.manager [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Received event network-changed-727c2892-1c96-4183-b45a-4602d53bd9be {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 708.852731] env[59534]: DEBUG nova.compute.manager [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Refreshing instance network info cache due to event network-changed-727c2892-1c96-4183-b45a-4602d53bd9be. {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 708.852731] env[59534]: DEBUG oslo_concurrency.lockutils [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] Acquiring lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 708.852731] env[59534]: DEBUG oslo_concurrency.lockutils [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] Acquired lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 708.852731] env[59534]: DEBUG nova.network.neutron [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Refreshing network info cache for port 727c2892-1c96-4183-b45a-4602d53bd9be {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 708.858870] env[59534]: INFO nova.compute.manager [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Took 0.13 seconds to deallocate network for instance.
[ 708.860944] env[59534]: DEBUG nova.compute.claims [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 708.861165] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 708.861397] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 708.956516] env[59534]: DEBUG nova.network.neutron [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 709.068231] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-668b188e-46c9-4671-ab7e-a8c45c20716f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.083215] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d615767f-b1a8-4b9f-b88d-f0dbf63286ef {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.128923] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d66cab3-02ae-4a5b-a98a-028c9f7f5cde {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.137267] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd90af23-857e-4a7c-b115-b7f5f1a1d879 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.158411] env[59534]: DEBUG nova.compute.provider_tree [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 709.170177] env[59534]: DEBUG nova.scheduler.client.report [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 709.197439] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.335s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 709.198076] env[59534]: ERROR nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information.
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Traceback (most recent call last):
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self.driver.spawn(context, instance, image_meta,
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] vm_ref = self.build_virtual_machine(instance,
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] vif_infos = vmwarevif.get_vif_info(self._session,
[ 709.198076] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] for vif in network_info:
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return self._sync_wrapper(fn, *args, **kwargs)
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self.wait()
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self[:] = self._gt.wait()
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return self._exit_event.wait()
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] result = hub.switch()
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return self.greenlet.switch()
[ 709.198872] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] result = function(*args, **kwargs)
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] return func(*args, **kwargs)
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] raise e
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] nwinfo = self.network_api.allocate_for_instance(
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] created_port_ids = self._update_ports_for_instance(
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] with excutils.save_and_reraise_exception():
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 709.199363] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] self.force_reraise()
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] raise self.value
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] updated_port = self._update_port(
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] _ensure_no_port_binding_failure(port)
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] raise exception.PortBindingFailed(port_id=port['id'])
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] nova.exception.PortBindingFailed: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information.
[ 709.199687] env[59534]: ERROR nova.compute.manager [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9]
[ 709.199687] env[59534]: DEBUG nova.compute.utils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 709.202774] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Build of instance 7552a136-0a44-43a0-909a-3495eddaa4c9 was re-scheduled: Binding failed for port 727c2892-1c96-4183-b45a-4602d53bd9be, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 709.205077] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 709.205077] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 709.384768] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "5a549ffd-3cc3-4723-bfe6-510dbef0fea7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 709.385033] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "5a549ffd-3cc3-4723-bfe6-510dbef0fea7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 709.395987] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 709.453519] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 709.453661] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 709.455146] env[59534]: INFO nova.compute.claims [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 709.654860] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-755c3357-a1bd-42c6-a30c-0208335f4a80 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.662818] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-937cd2fa-9d14-4ab9-a3b6-fff519b47839 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.694992] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4e26e16-d6eb-4938-b7d2-430fb94622fb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.702841] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53b2e626-fd0b-4b3b-90cc-d3b9341127f8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 709.717103] env[59534]: DEBUG nova.compute.provider_tree [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 709.727096] env[59534]: DEBUG nova.scheduler.client.report [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 709.744247] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 709.744730] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 709.788495] env[59534]: DEBUG nova.compute.utils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 709.789969] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Not allocating networking since 'none' was specified. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 709.806530] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 709.875975] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 709.903303] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 709.903516] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 709.905097] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432
tempest-ServerShowV247Test-312332432-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 709.905097] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 709.905307] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 709.906089] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 709.906089] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 709.906089] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 709.906089] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 
tempest-ServerShowV247Test-312332432-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 709.906733] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 709.906733] env[59534]: DEBUG nova.virt.hardware [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 709.907450] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-110da4fa-5b80-4e27-a41c-865508e263b9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.918756] env[59534]: DEBUG nova.network.neutron [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.923268] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea87d48d-9932-4037-98b8-074e95e7ccf0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.941321] env[59534]: DEBUG oslo_concurrency.lockutils [req-fedb7b1d-d143-426f-aeaa-73bb8acf95f9 req-607efb17-e025-4fb6-b172-8f2df023e51a service nova] Releasing lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.942404] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Instance VIF info [] {{(pid=59534) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 709.951250] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating folder: Project (353be627d6be43b0a2cb6bc48c77b78f). Parent ref: group-v280247. {{(pid=59534) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 709.951250] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.951250] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 709.951566] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ec8a4877-a24e-46bb-8566-2962113bace1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.962291] env[59534]: INFO nova.virt.vmwareapi.vm_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Created folder: Project 
(353be627d6be43b0a2cb6bc48c77b78f) in parent group-v280247. [ 709.962484] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating folder: Instances. Parent ref: group-v280255. {{(pid=59534) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 709.962767] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3a870841-17e6-4c26-913f-f30ecb3139b2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.975062] env[59534]: INFO nova.virt.vmwareapi.vm_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Created folder: Instances in parent group-v280255. [ 709.975663] env[59534]: DEBUG oslo.service.loopingcall [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 709.975663] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Creating VM on the ESX host {{(pid=59534) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 709.975814] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-50aa4582-ddd5-451d-8eb2-3c84e46c9bd7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.995762] env[59534]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 709.995762] env[59534]: value = "task-1308571" [ 709.995762] env[59534]: _type = "Task" [ 709.995762] env[59534]: } to complete. 
{{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 710.004404] env[59534]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308571, 'name': CreateVM_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 710.029629] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 710.511094] env[59534]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308571, 'name': CreateVM_Task, 'duration_secs': 0.241121} completed successfully. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 710.511396] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Created VM on the ESX host {{(pid=59534) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 710.511664] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 710.511823] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 710.512137] env[59534]: DEBUG oslo_concurrency.lockutils [None 
req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 710.512375] env[59534]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7534c4fe-391d-4e66-bdda-4c4a6c51ce45 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.517274] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){ [ 710.517274] env[59534]: value = "session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]523dd472-21ac-c550-2554-60b1bc23322a" [ 710.517274] env[59534]: _type = "Task" [ 710.517274] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 710.524970] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]523dd472-21ac-c550-2554-60b1bc23322a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 710.693378] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.704595] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 710.704826] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 710.705041] env[59534]: DEBUG nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 710.705330] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 710.800667] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 710.823893] env[59534]: DEBUG nova.network.neutron [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.843192] env[59534]: INFO nova.compute.manager [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Took 0.14 seconds to deallocate network for instance. 
[ 710.983156] env[59534]: INFO nova.scheduler.client.report [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Deleted allocations for instance 7552a136-0a44-43a0-909a-3495eddaa4c9 [ 711.023652] env[59534]: DEBUG oslo_concurrency.lockutils [None req-1e60474e-9714-4782-89ca-fc2b10d2d3f0 tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "7552a136-0a44-43a0-909a-3495eddaa4c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.619s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.024419] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "7552a136-0a44-43a0-909a-3495eddaa4c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 16.221s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.024662] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "7552a136-0a44-43a0-909a-3495eddaa4c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.024951] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "7552a136-0a44-43a0-909a-3495eddaa4c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.025138] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "7552a136-0a44-43a0-909a-3495eddaa4c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.035194] env[59534]: INFO nova.compute.manager [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Terminating instance [ 711.037771] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.038132] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Processing image ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 711.038211] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.038821] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.039043] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.040845] env[59534]: DEBUG nova.network.neutron [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.215298] env[59534]: DEBUG nova.network.neutron [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.315862] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.316108] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.328978] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 711.382370] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.382604] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.384073] env[59534]: INFO nova.compute.claims [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 711.548832] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2f85fbe-0c27-411e-8589-4c1f1a218923 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.559021] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2767089c-cb33-4093-b0cd-f055d41a415d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.600593] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a643186f-bc6c-443f-ba8f-945afdc05f3d {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.610472] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6287979d-5e53-4c90-bcf8-debf0aa3704c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.626800] env[59534]: DEBUG nova.compute.provider_tree [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 711.645387] env[59534]: DEBUG nova.scheduler.client.report [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 711.666155] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.666155] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 
tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 711.692239] env[59534]: DEBUG nova.network.neutron [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.701550] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-7552a136-0a44-43a0-909a-3495eddaa4c9" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.703503] env[59534]: DEBUG nova.compute.manager [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 711.703503] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 711.703503] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-55cf4287-449d-441f-87e5-9233fccbadcb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.712315] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1990c48a-69c0-43ad-a60a-65adae2832b1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.726230] env[59534]: DEBUG nova.compute.utils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 711.728580] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 711.729094] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 711.741155] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7552a136-0a44-43a0-909a-3495eddaa4c9 could not be found. [ 711.741362] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 711.741536] env[59534]: INFO nova.compute.manager [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 711.742181] env[59534]: DEBUG oslo.service.loopingcall [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 711.742509] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 711.745626] env[59534]: DEBUG nova.compute.manager [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 711.745626] env[59534]: DEBUG nova.network.neutron [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 711.802555] env[59534]: DEBUG nova.network.neutron [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.810571] env[59534]: DEBUG nova.network.neutron [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.826495] env[59534]: INFO nova.compute.manager [-] [instance: 7552a136-0a44-43a0-909a-3495eddaa4c9] Took 0.08 seconds to deallocate network for instance. [ 711.837623] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 711.887925] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 711.887925] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 711.887925] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 711.888222] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Flavor 
pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 711.888222] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 711.888222] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 711.888222] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 711.888222] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 711.888351] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 711.888351] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 
tempest-ListServerFiltersTestJSON-712311738-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 711.888351] env[59534]: DEBUG nova.virt.hardware [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 711.888979] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1baa4263-aa39-4c90-b0ab-34314c6001e0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.901919] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e427a52c-46cc-4272-a5f2-bf9538418ac5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.034234] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ac2edf85-7cf7-4806-a423-77c859ddddcc tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "7552a136-0a44-43a0-909a-3495eddaa4c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.009s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.237550] env[59534]: DEBUG nova.policy [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '159c147d52e0452c95dcb771fcd2309d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'02f55d9578bb4104ab4955671dfbff30', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 712.882700] env[59534]: ERROR nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. [ 712.882700] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 712.882700] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 712.882700] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 712.882700] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 712.882700] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 712.882700] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 712.882700] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 712.882700] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 712.882700] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 712.882700] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 712.882700] env[59534]: ERROR nova.compute.manager raise self.value [ 712.882700] env[59534]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 712.882700] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 712.882700] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 712.882700] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 712.883186] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 712.883186] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 712.883186] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. [ 712.883186] env[59534]: ERROR nova.compute.manager [ 712.883186] env[59534]: Traceback (most recent call last): [ 712.883186] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 712.883186] env[59534]: listener.cb(fileno) [ 712.883186] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 712.883186] env[59534]: result = function(*args, **kwargs) [ 712.883186] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 712.883186] env[59534]: return func(*args, **kwargs) [ 712.883186] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 712.883186] env[59534]: raise e [ 712.883186] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 712.883186] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 712.883186] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 712.883186] env[59534]: created_port_ids = self._update_ports_for_instance( [ 712.883186] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 712.883186] env[59534]: with excutils.save_and_reraise_exception(): [ 712.883186] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 712.883186] env[59534]: self.force_reraise() [ 712.883186] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 712.883186] env[59534]: raise self.value [ 712.883186] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 712.883186] env[59534]: updated_port = self._update_port( [ 712.883186] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 712.883186] env[59534]: _ensure_no_port_binding_failure(port) [ 712.883186] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 712.883186] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 712.883996] env[59534]: nova.exception.PortBindingFailed: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. [ 712.883996] env[59534]: Removing descriptor: 20 [ 712.883996] env[59534]: ERROR nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. 
[ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Traceback (most recent call last): [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] yield resources [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self.driver.spawn(context, instance, image_meta, [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self._vmops.spawn(context, instance, image_meta, injected_files, [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 712.883996] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] vm_ref = self.build_virtual_machine(instance, [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] vif_infos = vmwarevif.get_vif_info(self._session, [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 712.884317] env[59534]: ERROR 
nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] for vif in network_info: [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return self._sync_wrapper(fn, *args, **kwargs) [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self.wait() [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self[:] = self._gt.wait() [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return self._exit_event.wait() [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 712.884317] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] result = hub.switch() [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return self.greenlet.switch() [ 712.884695] env[59534]: ERROR 
nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] result = function(*args, **kwargs) [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return func(*args, **kwargs) [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] raise e [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] nwinfo = self.network_api.allocate_for_instance( [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] created_port_ids = self._update_ports_for_instance( [ 712.884695] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] with excutils.save_and_reraise_exception(): [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 
86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self.force_reraise() [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] raise self.value [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] updated_port = self._update_port( [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] _ensure_no_port_binding_failure(port) [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] raise exception.PortBindingFailed(port_id=port['id']) [ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] nova.exception.PortBindingFailed: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. 
[ 712.885086] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] [ 712.885467] env[59534]: INFO nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Terminating instance [ 712.886555] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "refresh_cache-86b1de9b-bfc7-4810-9906-e59e01f11594" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.886717] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquired lock "refresh_cache-86b1de9b-bfc7-4810-9906-e59e01f11594" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.886878] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 712.956704] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.823055] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.836032] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Releasing lock "refresh_cache-86b1de9b-bfc7-4810-9906-e59e01f11594" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 713.836795] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 713.837077] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 713.838761] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-15d16a6b-01a7-4c16-8208-81943038dd86 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.850504] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a214e44c-308d-4e7c-8d96-de7f2162821b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.876584] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 86b1de9b-bfc7-4810-9906-e59e01f11594 could not be found. [ 713.876584] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 713.876584] env[59534]: INFO nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 713.876584] env[59534]: DEBUG oslo.service.loopingcall [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 713.876584] env[59534]: DEBUG nova.compute.manager [-] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 713.876749] env[59534]: DEBUG nova.network.neutron [-] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.972987] env[59534]: DEBUG nova.network.neutron [-] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.982136] env[59534]: DEBUG nova.network.neutron [-] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.993508] env[59534]: INFO nova.compute.manager [-] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Took 0.12 seconds to deallocate network for instance. 
[ 713.995272] env[59534]: DEBUG nova.compute.claims [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 713.995482] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.995696] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.176195] env[59534]: ERROR nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. 
[ 714.176195] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 714.176195] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.176195] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 714.176195] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.176195] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 714.176195] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.176195] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 714.176195] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.176195] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 714.176195] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.176195] env[59534]: ERROR nova.compute.manager raise self.value [ 714.176195] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.176195] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 714.176195] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.176195] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 714.176625] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.176625] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 714.176625] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. [ 714.176625] env[59534]: ERROR nova.compute.manager [ 714.176625] env[59534]: Traceback (most recent call last): [ 714.176625] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 714.176625] env[59534]: listener.cb(fileno) [ 714.176625] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.176625] env[59534]: result = function(*args, **kwargs) [ 714.176625] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.176625] env[59534]: return func(*args, **kwargs) [ 714.176625] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.176625] env[59534]: raise e [ 714.176625] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.176625] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 714.176625] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.176625] env[59534]: created_port_ids = self._update_ports_for_instance( [ 714.176625] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.176625] env[59534]: with excutils.save_and_reraise_exception(): [ 714.176625] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.176625] env[59534]: self.force_reraise() [ 714.176625] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.176625] env[59534]: raise self.value [ 714.176625] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.176625] env[59534]: updated_port = self._update_port( [ 714.176625] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.176625] env[59534]: _ensure_no_port_binding_failure(port) [ 714.176625] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.176625] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 714.177317] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. [ 714.177317] env[59534]: Removing descriptor: 18 [ 714.177317] env[59534]: ERROR nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Traceback (most recent call last): [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] yield resources [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self.driver.spawn(context, instance, image_meta, [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.177317] env[59534]: ERROR nova.compute.manager 
[instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 714.177317] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] vm_ref = self.build_virtual_machine(instance, [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] vif_infos = vmwarevif.get_vif_info(self._session, [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] for vif in network_info: [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return self._sync_wrapper(fn, *args, **kwargs) [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self.wait() [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self[:] = self._gt.wait() [ 714.177600] 
env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return self._exit_event.wait() [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 714.177600] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] result = hub.switch() [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return self.greenlet.switch() [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] result = function(*args, **kwargs) [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return func(*args, **kwargs) [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] raise e [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] nwinfo = self.network_api.allocate_for_instance( [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] created_port_ids = self._update_ports_for_instance( [ 714.177920] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] with excutils.save_and_reraise_exception(): [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self.force_reraise() [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] raise self.value [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] updated_port = self._update_port( [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] _ensure_no_port_binding_failure(port) [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] raise exception.PortBindingFailed(port_id=port['id']) [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. [ 714.178226] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] [ 714.178523] env[59534]: INFO nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Terminating instance [ 714.180340] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "refresh_cache-11ff4621-5da2-4da3-97bf-2ff6b231233c" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.180507] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquired lock "refresh_cache-11ff4621-5da2-4da3-97bf-2ff6b231233c" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.180673] env[59534]: DEBUG nova.network.neutron [None 
req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 714.233459] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1584bf-575b-4764-99d1-db56ce9ea3de {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.247400] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54ef1304-b080-4043-bbad-e325937351d9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.282112] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82332e1c-8577-4db2-a71d-764adaacd445 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.289743] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3e024b3-cb1e-4959-a810-db9bec0cdccb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.310984] env[59534]: DEBUG nova.compute.provider_tree [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.313558] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Successfully 
created port: 2e9d9d04-ebae-4284-8649-586e770f1bfd {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 714.319991] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 714.323758] env[59534]: DEBUG nova.scheduler.client.report [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.346784] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.351s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.347418] env[59534]: ERROR nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 
e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Traceback (most recent call last): [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self.driver.spawn(context, instance, image_meta, [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] vm_ref = self.build_virtual_machine(instance, [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] vif_infos = vmwarevif.get_vif_info(self._session, [ 714.347418] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] for vif in network_info: [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File 
"/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return self._sync_wrapper(fn, *args, **kwargs) [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self.wait() [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] self[:] = self._gt.wait() [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return self._exit_event.wait() [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] result = hub.switch() [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return self.greenlet.switch() [ 714.347741] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.348194] env[59534]: ERROR nova.compute.manager 
[instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] result = function(*args, **kwargs) [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] return func(*args, **kwargs) [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] raise e [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] nwinfo = self.network_api.allocate_for_instance( [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] created_port_ids = self._update_ports_for_instance( [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] with excutils.save_and_reraise_exception(): [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.348194] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] 
self.force_reraise() [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] raise self.value [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] updated_port = self._update_port( [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] _ensure_no_port_binding_failure(port) [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] raise exception.PortBindingFailed(port_id=port['id']) [ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] nova.exception.PortBindingFailed: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. 
[ 714.348497] env[59534]: ERROR nova.compute.manager [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] [ 714.348497] env[59534]: DEBUG nova.compute.utils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 714.355954] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Build of instance 86b1de9b-bfc7-4810-9906-e59e01f11594 was re-scheduled: Binding failed for port e33ba32f-c6a6-4766-af21-edfc90a29ea1, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 714.355954] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 714.355954] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "refresh_cache-86b1de9b-bfc7-4810-9906-e59e01f11594" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.355954] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquired lock "refresh_cache-86b1de9b-bfc7-4810-9906-e59e01f11594" {{(pid=59534) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.356623] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 714.706178] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 714.810528] env[59534]: ERROR nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. 
[ 714.810528] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 714.810528] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.810528] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 714.810528] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.810528] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 714.810528] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.810528] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 714.810528] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.810528] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 714.810528] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.810528] env[59534]: ERROR nova.compute.manager raise self.value [ 714.810528] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.810528] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 714.810528] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.810528] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 714.811030] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.811030] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 714.811030] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. [ 714.811030] env[59534]: ERROR nova.compute.manager [ 714.811030] env[59534]: Traceback (most recent call last): [ 714.811030] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 714.811030] env[59534]: listener.cb(fileno) [ 714.811030] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.811030] env[59534]: result = function(*args, **kwargs) [ 714.811030] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.811030] env[59534]: return func(*args, **kwargs) [ 714.811030] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.811030] env[59534]: raise e [ 714.811030] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.811030] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 714.811030] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.811030] env[59534]: created_port_ids = self._update_ports_for_instance( [ 714.811030] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.811030] env[59534]: with excutils.save_and_reraise_exception(): [ 714.811030] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.811030] env[59534]: self.force_reraise() [ 714.811030] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.811030] env[59534]: raise self.value [ 714.811030] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.811030] env[59534]: updated_port = self._update_port( [ 714.811030] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.811030] env[59534]: _ensure_no_port_binding_failure(port) [ 714.811030] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.811030] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 714.811679] env[59534]: nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. [ 714.811679] env[59534]: Removing descriptor: 19 [ 714.811679] env[59534]: ERROR nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. [ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Traceback (most recent call last): [ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] yield resources [ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self.driver.spawn(context, instance, image_meta, [ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.811679] env[59534]: ERROR nova.compute.manager 
[instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 714.811679] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] vm_ref = self.build_virtual_machine(instance,
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] vif_infos = vmwarevif.get_vif_info(self._session,
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] for vif in network_info:
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return self._sync_wrapper(fn, *args, **kwargs)
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self.wait()
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self[:] = self._gt.wait()
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return self._exit_event.wait()
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 714.811951] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] result = hub.switch()
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return self.greenlet.switch()
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] result = function(*args, **kwargs)
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return func(*args, **kwargs)
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] raise e
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] nwinfo = self.network_api.allocate_for_instance(
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] created_port_ids = self._update_ports_for_instance(
[ 714.812287] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] with excutils.save_and_reraise_exception():
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self.force_reraise()
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] raise self.value
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] updated_port = self._update_port(
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] _ensure_no_port_binding_failure(port)
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] raise exception.PortBindingFailed(port_id=port['id'])
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information.
[ 714.812574] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609]
[ 714.812892] env[59534]: INFO nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Terminating instance
[ 714.813841] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "refresh_cache-1b3f51ff-2374-499e-8d60-6a0e5cdd2609" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 714.813994] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquired lock "refresh_cache-1b3f51ff-2374-499e-8d60-6a0e5cdd2609" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 714.814172] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 714.910806] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 715.036247] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 715.047619] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Releasing lock "refresh_cache-11ff4621-5da2-4da3-97bf-2ff6b231233c" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 715.048676] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 715.048676] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 715.048834] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-da79d615-88bb-4e01-b5dc-9723d5c5cb13 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.060118] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bd47f2a-73eb-49fc-9b56-8301f78d32c1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.084742] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 11ff4621-5da2-4da3-97bf-2ff6b231233c could not be found.
[ 715.084992] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 715.086129] env[59534]: INFO nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 715.086129] env[59534]: DEBUG oslo.service.loopingcall [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 715.086129] env[59534]: DEBUG nova.compute.manager [-] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 715.086129] env[59534]: DEBUG nova.network.neutron [-] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 715.189985] env[59534]: DEBUG nova.network.neutron [-] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 715.198193] env[59534]: DEBUG nova.network.neutron [-] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 715.207506] env[59534]: INFO nova.compute.manager [-] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Took 0.12 seconds to deallocate network for instance.
[ 715.209494] env[59534]: DEBUG nova.compute.claims [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 715.209665] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 715.209868] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 715.395061] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbdadf2d-2f8d-4e1e-ad67-26e822959199 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.402517] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fb7b408-e11f-48ec-8846-39e6ab4a80f7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.411018] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 715.434767] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Releasing lock "refresh_cache-86b1de9b-bfc7-4810-9906-e59e01f11594" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 715.435017] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 715.435198] env[59534]: DEBUG nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 715.435359] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 715.437541] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e858c3-aed4-4b63-98e0-e590e66ca695 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.446201] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af09ffd9-90d9-49bd-aaae-040d32648cc1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.460132] env[59534]: DEBUG nova.compute.provider_tree [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 715.470180] env[59534]: DEBUG nova.scheduler.client.report [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 715.485053] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 715.485660] env[59534]: ERROR nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information.
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Traceback (most recent call last):
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self.driver.spawn(context, instance, image_meta,
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] vm_ref = self.build_virtual_machine(instance,
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] vif_infos = vmwarevif.get_vif_info(self._session,
[ 715.485660] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] for vif in network_info:
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return self._sync_wrapper(fn, *args, **kwargs)
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self.wait()
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self[:] = self._gt.wait()
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return self._exit_event.wait()
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] result = hub.switch()
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return self.greenlet.switch()
[ 715.486344] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] result = function(*args, **kwargs)
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] return func(*args, **kwargs)
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] raise e
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] nwinfo = self.network_api.allocate_for_instance(
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] created_port_ids = self._update_ports_for_instance(
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] with excutils.save_and_reraise_exception():
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 715.487089] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] self.force_reraise()
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] raise self.value
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] updated_port = self._update_port(
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] _ensure_no_port_binding_failure(port)
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] raise exception.PortBindingFailed(port_id=port['id'])
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] nova.exception.PortBindingFailed: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information.
[ 715.487613] env[59534]: ERROR nova.compute.manager [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c]
[ 715.487613] env[59534]: DEBUG nova.compute.utils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 715.489249] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 715.489368] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Build of instance 11ff4621-5da2-4da3-97bf-2ff6b231233c was re-scheduled: Binding failed for port 8b6deb85-5996-4f3a-8207-fca9bf3c99e3, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 715.489757] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 715.489968] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "refresh_cache-11ff4621-5da2-4da3-97bf-2ff6b231233c" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 715.490125] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquired lock "refresh_cache-11ff4621-5da2-4da3-97bf-2ff6b231233c" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 715.490278] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 715.497147] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 715.508626] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Releasing lock "refresh_cache-1b3f51ff-2374-499e-8d60-6a0e5cdd2609" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 715.508994] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 715.509196] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 715.509840] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cbcae441-d25e-4425-86bf-30ff95bfba95 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.511845] env[59534]: DEBUG nova.network.neutron [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 715.519762] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f044f5bf-1e20-4fbf-a5b8-30e9d111478b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 715.530599] env[59534]: INFO nova.compute.manager [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 86b1de9b-bfc7-4810-9906-e59e01f11594] Took 0.10 seconds to deallocate network for instance.
[ 715.546446] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1b3f51ff-2374-499e-8d60-6a0e5cdd2609 could not be found.
[ 715.546659] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 715.546829] env[59534]: INFO nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 715.547071] env[59534]: DEBUG oslo.service.loopingcall [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 715.547266] env[59534]: DEBUG nova.compute.manager [-] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 715.547356] env[59534]: DEBUG nova.network.neutron [-] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 715.621858] env[59534]: INFO nova.scheduler.client.report [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Deleted allocations for instance 86b1de9b-bfc7-4810-9906-e59e01f11594
[ 715.638356] env[59534]: DEBUG nova.network.neutron [-] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 715.648205] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8e0a0a12-bbba-4b09-a2e2-0f0146ebae15 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "86b1de9b-bfc7-4810-9906-e59e01f11594" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.250s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 715.655576] env[59534]: DEBUG nova.network.neutron [-] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 715.665307] env[59534]: INFO nova.compute.manager [-] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Took 0.12 seconds to deallocate network for instance.
[ 715.670633] env[59534]: DEBUG nova.compute.claims [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 715.670633] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.670633] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.804930] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.866331] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15a88ba8-62ba-4cb2-a0d2-b4aaf6a9f76d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.876970] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a0e28f9-ac73-4ff4-8d9b-e3622dfbfc8f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.921390] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3926a65-9d89-40c9-ade5-cc0dcff84896 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.929904] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46878949-f74f-4f57-8981-84faf8bea62f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.944669] env[59534]: DEBUG nova.compute.provider_tree [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 715.962606] env[59534]: DEBUG nova.scheduler.client.report [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 715.992370] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.319s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.992370] env[59534]: ERROR nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. 
[ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Traceback (most recent call last): [ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self.driver.spawn(context, instance, image_meta, [ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self._vmops.spawn(context, instance, image_meta, injected_files, [ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 715.992370] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] vm_ref = self.build_virtual_machine(instance, [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] vif_infos = vmwarevif.get_vif_info(self._session, [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] for vif in network_info: [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 715.992771] env[59534]: ERROR 
nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return self._sync_wrapper(fn, *args, **kwargs) [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self.wait() [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self[:] = self._gt.wait() [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return self._exit_event.wait() [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 715.992771] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] result = hub.switch() [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return self.greenlet.switch() [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] result = function(*args, **kwargs) [ 
715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] return func(*args, **kwargs) [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] raise e [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] nwinfo = self.network_api.allocate_for_instance( [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] created_port_ids = self._update_ports_for_instance( [ 715.993120] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] with excutils.save_and_reraise_exception(): [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] self.force_reraise() [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 
1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] raise self.value [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] updated_port = self._update_port( [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] _ensure_no_port_binding_failure(port) [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] raise exception.PortBindingFailed(port_id=port['id']) [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] nova.exception.PortBindingFailed: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. [ 715.993413] env[59534]: ERROR nova.compute.manager [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] [ 715.993737] env[59534]: DEBUG nova.compute.utils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 715.995966] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Build of instance 1b3f51ff-2374-499e-8d60-6a0e5cdd2609 was re-scheduled: Binding failed for port bd89f1d4-a239-4ce1-91f2-766e8f9cf459, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 715.995966] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 715.995966] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquiring lock "refresh_cache-1b3f51ff-2374-499e-8d60-6a0e5cdd2609" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.995966] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Acquired lock "refresh_cache-1b3f51ff-2374-499e-8d60-6a0e5cdd2609" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.997275] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Building network info cache 
for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.063925] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.077075] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "880bddac-daec-4194-8e5d-e7aaba8c2dd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.077291] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "880bddac-daec-4194-8e5d-e7aaba8c2dd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.086481] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 716.096717] env[59534]: ERROR nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. [ 716.096717] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 716.096717] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 716.096717] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 716.096717] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.096717] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 716.096717] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.096717] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 716.096717] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.096717] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 716.096717] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.096717] env[59534]: ERROR nova.compute.manager raise self.value [ 716.096717] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.096717] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 716.096717] env[59534]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.096717] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 716.097210] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.097210] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 716.097210] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. [ 716.097210] env[59534]: ERROR nova.compute.manager [ 716.097210] env[59534]: Traceback (most recent call last): [ 716.097210] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 716.097210] env[59534]: listener.cb(fileno) [ 716.097210] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 716.097210] env[59534]: result = function(*args, **kwargs) [ 716.097210] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 716.097210] env[59534]: return func(*args, **kwargs) [ 716.097210] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 716.097210] env[59534]: raise e [ 716.097210] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 716.097210] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 716.097210] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.097210] env[59534]: created_port_ids = self._update_ports_for_instance( [ 716.097210] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.097210] env[59534]: with excutils.save_and_reraise_exception(): [ 716.097210] env[59534]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.097210] env[59534]: self.force_reraise() [ 716.097210] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.097210] env[59534]: raise self.value [ 716.097210] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.097210] env[59534]: updated_port = self._update_port( [ 716.097210] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.097210] env[59534]: _ensure_no_port_binding_failure(port) [ 716.097210] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.097210] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 716.097931] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. [ 716.097931] env[59534]: Removing descriptor: 17 [ 716.097931] env[59534]: ERROR nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. 
[ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Traceback (most recent call last): [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] yield resources [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self.driver.spawn(context, instance, image_meta, [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self._vmops.spawn(context, instance, image_meta, injected_files, [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 716.097931] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] vm_ref = self.build_virtual_machine(instance, [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] vif_infos = vmwarevif.get_vif_info(self._session, [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 716.098245] env[59534]: ERROR 
nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] for vif in network_info: [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return self._sync_wrapper(fn, *args, **kwargs) [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self.wait() [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self[:] = self._gt.wait() [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return self._exit_event.wait() [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 716.098245] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] result = hub.switch() [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return self.greenlet.switch() [ 716.098793] env[59534]: ERROR 
nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] result = function(*args, **kwargs) [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return func(*args, **kwargs) [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] raise e [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] nwinfo = self.network_api.allocate_for_instance( [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] created_port_ids = self._update_ports_for_instance( [ 716.098793] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] with excutils.save_and_reraise_exception(): [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 
68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self.force_reraise() [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] raise self.value [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] updated_port = self._update_port( [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] _ensure_no_port_binding_failure(port) [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] raise exception.PortBindingFailed(port_id=port['id']) [ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] nova.exception.PortBindingFailed: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. 
[ 716.099136] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] [ 716.099474] env[59534]: INFO nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Terminating instance [ 716.102925] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Acquiring lock "refresh_cache-68f534dd-7119-48f4-85ae-14bfdf68d486" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.102925] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Acquired lock "refresh_cache-68f534dd-7119-48f4-85ae-14bfdf68d486" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.102925] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.146858] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.147086] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f 
tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.148757] env[59534]: INFO nova.compute.claims [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 716.207851] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.307308] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.309649] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.320887] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 
tempest-ListImageFiltersTestJSON-769684321-project-member] Releasing lock "refresh_cache-11ff4621-5da2-4da3-97bf-2ff6b231233c" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.320887] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 716.321031] env[59534]: DEBUG nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 716.321098] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 716.323202] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Releasing lock "refresh_cache-1b3f51ff-2374-499e-8d60-6a0e5cdd2609" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.323592] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Virt driver does not provide unplug_vifs method, so it is not possible 
determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 716.323592] env[59534]: DEBUG nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 716.323693] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 716.351194] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3c389c7-1424-4cab-9216-1eb3948aecda {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.362197] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16e0054d-65be-493f-931f-8f60560e55da {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.398493] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.401133] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee3f551d-2b9e-4415-8310-ff7c668b89ab {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.406355] env[59534]: DEBUG nova.network.neutron [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.412414] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.414328] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-287abc16-b0a8-4e58-bcd6-c9580a3561b1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.419913] env[59534]: INFO nova.compute.manager [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 11ff4621-5da2-4da3-97bf-2ff6b231233c] Took 0.10 seconds to deallocate network for instance. 
[ 716.431371] env[59534]: DEBUG nova.compute.provider_tree [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.432903] env[59534]: DEBUG nova.network.neutron [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.442029] env[59534]: DEBUG nova.scheduler.client.report [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.452188] env[59534]: INFO nova.compute.manager [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] [instance: 1b3f51ff-2374-499e-8d60-6a0e5cdd2609] Took 0.13 seconds to deallocate network for instance. 
[ 716.455429] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.455933] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 716.489126] env[59534]: DEBUG nova.compute.utils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 716.492723] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 716.493067] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 716.498336] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 716.553819] env[59534]: INFO nova.scheduler.client.report [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Deleted allocations for instance 11ff4621-5da2-4da3-97bf-2ff6b231233c [ 716.587549] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 716.590797] env[59534]: INFO nova.scheduler.client.report [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Deleted allocations for instance 1b3f51ff-2374-499e-8d60-6a0e5cdd2609 [ 716.597831] env[59534]: DEBUG oslo_concurrency.lockutils [None req-08927b82-ba38-431b-b04d-3609bb83d66b tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "11ff4621-5da2-4da3-97bf-2ff6b231233c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.189s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.616239] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 716.616524] env[59534]: DEBUG nova.virt.hardware [None 
req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 716.616990] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 716.616990] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 716.616990] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 716.617148] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 716.617854] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 716.617854] env[59534]: 
DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 716.617854] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 716.617854] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 716.618060] env[59534]: DEBUG nova.virt.hardware [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 716.619956] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d8fd42-278a-4896-b35d-27e12ab956a2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.622865] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8178dce7-6b3e-4984-8bfa-5132bf8627a1 tempest-ListImageFiltersTestJSON-769684321 tempest-ListImageFiltersTestJSON-769684321-project-member] Lock "1b3f51ff-2374-499e-8d60-6a0e5cdd2609" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.792s {{(pid=59534) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.629959] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6551ed9-1f25-41c4-8282-ad3c846ec37a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.054313] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.070968] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Releasing lock "refresh_cache-68f534dd-7119-48f4-85ae-14bfdf68d486" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.070968] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 717.071180] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 717.071685] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cfbf104d-ebdd-41f2-bde9-ef0175aec0b4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.081390] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f68768d9-b2f1-4656-ac9f-5b21547ee74a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.097680] env[59534]: DEBUG nova.policy [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '159c147d52e0452c95dcb771fcd2309d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02f55d9578bb4104ab4955671dfbff30', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 717.106014] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance does not exist on backend: 
nova.exception.InstanceNotFound: Instance 68f534dd-7119-48f4-85ae-14bfdf68d486 could not be found. [ 717.106344] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 717.106456] env[59534]: INFO nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Took 0.04 seconds to destroy the instance on the hypervisor. [ 717.107123] env[59534]: DEBUG oslo.service.loopingcall [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 717.107213] env[59534]: DEBUG nova.compute.manager [-] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 717.107284] env[59534]: DEBUG nova.network.neutron [-] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.250388] env[59534]: DEBUG nova.network.neutron [-] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.265703] env[59534]: DEBUG nova.network.neutron [-] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.285908] env[59534]: INFO nova.compute.manager [-] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Took 0.18 seconds to deallocate network for instance. [ 717.291917] env[59534]: DEBUG nova.compute.claims [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 717.291917] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.291917] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.451182] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-133bf234-a31f-4a04-ae18-cf12efd8fa30 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.460867] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-670a26b9-a468-48c5-9e35-26788196a0a7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.498423] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18b5cf2d-7aff-4457-8988-09fb2f53ae69 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.509108] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquiring lock "9ef14f08-3ae3-48cb-ae99-2b2731faeeab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.509513] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Lock "9ef14f08-3ae3-48cb-ae99-2b2731faeeab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.519467] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e29f004-f7d0-47e5-a86c-8fc74df6b514 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.527593] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 717.543441] env[59534]: DEBUG nova.compute.provider_tree [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.552773] env[59534]: DEBUG nova.scheduler.client.report [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.593022] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.593022] env[59534]: ERROR nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 
69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Traceback (most recent call last): [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self.driver.spawn(context, instance, image_meta, [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.593022] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] vm_ref = self.build_virtual_machine(instance, [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] for vif in network_info: [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File 
"/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return self._sync_wrapper(fn, *args, **kwargs) [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self.wait() [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] self[:] = self._gt.wait() [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return self._exit_event.wait() [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 717.593337] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] result = hub.switch() [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return self.greenlet.switch() [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 717.593659] env[59534]: ERROR nova.compute.manager 
[instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] result = function(*args, **kwargs) [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] return func(*args, **kwargs) [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] raise e [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] nwinfo = self.network_api.allocate_for_instance( [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] created_port_ids = self._update_ports_for_instance( [ 717.593659] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] with excutils.save_and_reraise_exception(): [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] 
self.force_reraise() [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] raise self.value [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] updated_port = self._update_port( [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] _ensure_no_port_binding_failure(port) [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] raise exception.PortBindingFailed(port_id=port['id']) [ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] nova.exception.PortBindingFailed: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. 
[ 717.593960] env[59534]: ERROR nova.compute.manager [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] [ 717.594275] env[59534]: DEBUG nova.compute.utils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 717.604394] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Build of instance 68f534dd-7119-48f4-85ae-14bfdf68d486 was re-scheduled: Binding failed for port 69c955c3-52ad-4fe7-a327-c43a02064bd0, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 717.604394] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 717.604524] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Acquiring lock "refresh_cache-68f534dd-7119-48f4-85ae-14bfdf68d486" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 717.604683] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Acquired 
lock "refresh_cache-68f534dd-7119-48f4-85ae-14bfdf68d486" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 717.604799] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 717.612212] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.612412] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.613986] env[59534]: INFO nova.compute.claims [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 717.678921] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "1d6fb105-7087-4bdf-9b1c-b194baf39a55" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.679187] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "1d6fb105-7087-4bdf-9b1c-b194baf39a55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.691444] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 717.694724] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.749048] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.791881] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa0c3aaa-23cc-4a0f-a65d-f6add265831b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.800143] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7adc8be2-711f-4530-bd30-bd18eaac2cda {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.833505] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2175e4a-26b5-4617-9d18-af634d05ce9f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.841469] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53aca6a3-6761-4669-8298-939f471ed9e5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.855488] env[59534]: DEBUG nova.compute.provider_tree [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.863500] env[59534]: DEBUG nova.scheduler.client.report [None 
req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.877558] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.878236] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 717.880570] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.132s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.881944] env[59534]: INFO nova.compute.claims [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 717.916065] env[59534]: DEBUG nova.compute.utils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 717.919452] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 717.919667] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 717.926029] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 718.005256] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 718.029606] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 718.029927] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 718.030098] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 718.030281] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Flavor pref 0:0:0 
{{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 718.030454] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 718.030570] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 718.030775] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 718.030927] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 718.031213] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 718.031435] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 
tempest-SecurityGroupsTestJSON-1311240703-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 718.031662] env[59534]: DEBUG nova.virt.hardware [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 718.032562] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18686eab-2554-40bd-b2e3-39b4bdbc6ce8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.044419] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2a7bf55-00f1-4545-be52-9326de7098b6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.077425] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49c12e8-2613-41e7-a49e-1c1011dec021 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.084628] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5f3086-a1a4-4382-867f-e9817f410c27 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.116070] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3635ee8-b371-4d1d-af3b-67b2af5cf535 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.127287] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7373f7a5-2a93-4126-82f4-7350c19cd68c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.142871] env[59534]: DEBUG nova.compute.provider_tree [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.153534] env[59534]: DEBUG nova.scheduler.client.report [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.170886] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.171538] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 718.175401] env[59534]: DEBUG nova.policy [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '35fffa3ef1b94ff691c83c6725541463', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3fdbb5e7bc98478b8346c4de35a059f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 718.235848] env[59534]: DEBUG nova.compute.utils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 718.235848] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Not allocating networking since 'none' was specified. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 718.245755] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 718.313211] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 718.338704] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 718.339126] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 718.339379] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 
tempest-ServerShowV247Test-312332432-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 718.339602] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 718.339878] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 718.340087] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 718.340347] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 718.340548] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 718.341115] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 
tempest-ServerShowV247Test-312332432-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 718.341115] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 718.341115] env[59534]: DEBUG nova.virt.hardware [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 718.342118] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3a83b3a-00c7-4a9f-a8a6-f42e7cf6ef78 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.352065] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-735cc839-a60e-4f83-a8bd-c92f2404231e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.367415] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Instance VIF info [] {{(pid=59534) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 718.373372] env[59534]: DEBUG oslo.service.loopingcall [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 718.373888] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Creating VM on the ESX host {{(pid=59534) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 718.373888] env[59534]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e609d644-ff4e-4520-9019-d8703a3e5032 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.394402] env[59534]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 718.394402] env[59534]: value = "task-1308572" [ 718.394402] env[59534]: _type = "Task" [ 718.394402] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.402981] env[59534]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308572, 'name': CreateVM_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 718.490994] env[59534]: ERROR nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. 
[ 718.490994] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 718.490994] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.490994] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 718.490994] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.490994] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 718.490994] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.490994] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 718.490994] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.490994] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 718.490994] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.490994] env[59534]: ERROR nova.compute.manager raise self.value [ 718.490994] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.490994] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 718.490994] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.490994] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 718.491679] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.491679] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 718.491679] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. [ 718.491679] env[59534]: ERROR nova.compute.manager [ 718.492784] env[59534]: Traceback (most recent call last): [ 718.492784] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 718.492784] env[59534]: listener.cb(fileno) [ 718.492784] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 718.492784] env[59534]: result = function(*args, **kwargs) [ 718.492784] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 718.492784] env[59534]: return func(*args, **kwargs) [ 718.492784] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 718.492784] env[59534]: raise e [ 718.492784] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.492784] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 718.492784] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.492784] env[59534]: created_port_ids = self._update_ports_for_instance( [ 718.492784] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.492784] env[59534]: with excutils.save_and_reraise_exception(): [ 718.492784] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.492784] env[59534]: self.force_reraise() [ 718.492784] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.492784] env[59534]: raise self.value [ 718.492784] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.492784] env[59534]: updated_port = self._update_port( [ 718.492784] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.492784] env[59534]: _ensure_no_port_binding_failure(port) [ 718.492784] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.492784] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 718.492784] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. [ 718.492784] env[59534]: Removing descriptor: 21 [ 718.493826] env[59534]: ERROR nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Traceback (most recent call last): [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] yield resources [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self.driver.spawn(context, instance, image_meta, [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: 
fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] vm_ref = self.build_virtual_machine(instance, [ 718.493826] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] vif_infos = vmwarevif.get_vif_info(self._session, [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] for vif in network_info: [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return self._sync_wrapper(fn, *args, **kwargs) [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self.wait() [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self[:] = self._gt.wait() [ 718.494114] 
env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return self._exit_event.wait() [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] result = hub.switch() [ 718.494114] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return self.greenlet.switch() [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] result = function(*args, **kwargs) [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return func(*args, **kwargs) [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] raise e [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] nwinfo = self.network_api.allocate_for_instance( [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] created_port_ids = self._update_ports_for_instance( [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.494431] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] with excutils.save_and_reraise_exception(): [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self.force_reraise() [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] raise self.value [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] updated_port = self._update_port( [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] _ensure_no_port_binding_failure(port) [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] raise exception.PortBindingFailed(port_id=port['id']) [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. [ 718.496835] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] [ 718.498236] env[59534]: INFO nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Terminating instance [ 718.499303] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.499303] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquired lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.499421] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 
tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 718.527756] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "f1e315bf-9348-4631-af77-85ec3a986a83" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.527846] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "f1e315bf-9348-4631-af77-85ec3a986a83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.538448] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 718.595351] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.595597] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.597389] env[59534]: INFO nova.compute.claims [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 718.618246] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.625235] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Releasing lock "refresh_cache-68f534dd-7119-48f4-85ae-14bfdf68d486" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 718.625547] env[59534]: 
DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 718.625787] env[59534]: DEBUG nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 718.626137] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 718.728040] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 718.731560] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 718.741058] env[59534]: DEBUG nova.network.neutron [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.749636] env[59534]: INFO nova.compute.manager [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] [instance: 68f534dd-7119-48f4-85ae-14bfdf68d486] Took 0.12 seconds to deallocate network for instance. [ 718.777069] env[59534]: ERROR nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. 
[ 718.777069] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 718.777069] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.777069] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 718.777069] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.777069] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 718.777069] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.777069] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 718.777069] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.777069] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 718.777069] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.777069] env[59534]: ERROR nova.compute.manager raise self.value [ 718.777069] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.777069] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 718.777069] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.777069] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 718.777516] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.777516] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 718.777516] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. [ 718.777516] env[59534]: ERROR nova.compute.manager [ 718.777516] env[59534]: Traceback (most recent call last): [ 718.777516] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 718.777516] env[59534]: listener.cb(fileno) [ 718.777516] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 718.777516] env[59534]: result = function(*args, **kwargs) [ 718.777516] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 718.777516] env[59534]: return func(*args, **kwargs) [ 718.777516] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 718.777516] env[59534]: raise e [ 718.777516] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.777516] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 718.777516] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.777516] env[59534]: created_port_ids = self._update_ports_for_instance( [ 718.777516] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.777516] env[59534]: with excutils.save_and_reraise_exception(): [ 718.777516] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.777516] env[59534]: self.force_reraise() [ 718.777516] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.777516] env[59534]: raise self.value [ 718.777516] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.777516] env[59534]: updated_port = self._update_port( [ 718.777516] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.777516] env[59534]: _ensure_no_port_binding_failure(port) [ 718.777516] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.777516] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 718.778214] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. [ 718.778214] env[59534]: Removing descriptor: 12 [ 718.778214] env[59534]: ERROR nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Traceback (most recent call last): [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] yield resources [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self.driver.spawn(context, instance, image_meta, [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.778214] env[59534]: ERROR nova.compute.manager 
[instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 718.778214] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] vm_ref = self.build_virtual_machine(instance, [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] vif_infos = vmwarevif.get_vif_info(self._session, [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] for vif in network_info: [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return self._sync_wrapper(fn, *args, **kwargs) [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self.wait() [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self[:] = self._gt.wait() [ 718.778599] 
env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return self._exit_event.wait() [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 718.778599] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] result = hub.switch() [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return self.greenlet.switch() [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] result = function(*args, **kwargs) [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return func(*args, **kwargs) [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] raise e [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] nwinfo = self.network_api.allocate_for_instance( [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] created_port_ids = self._update_ports_for_instance( [ 718.779017] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] with excutils.save_and_reraise_exception(): [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self.force_reraise() [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] raise self.value [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] updated_port = self._update_port( [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] _ensure_no_port_binding_failure(port) [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] raise exception.PortBindingFailed(port_id=port['id']) [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. [ 718.779407] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] [ 718.779753] env[59534]: INFO nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Terminating instance [ 718.781046] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "refresh_cache-8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.781046] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquired lock "refresh_cache-8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.781046] env[59534]: DEBUG nova.network.neutron [None 
req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 718.825384] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc1baacc-5135-4496-90ef-59ddc8d4dafa {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.839996] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db01301f-fb1d-4836-8950-4948bc312d3f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.871361] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 718.874134] env[59534]: INFO nova.scheduler.client.report [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Deleted allocations for instance 68f534dd-7119-48f4-85ae-14bfdf68d486 [ 718.879914] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e95b7d3d-a949-484d-ad94-8045479f9309 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.890732] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b4c0791-df9d-4170-8900-b1e879f377ec {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.903034] env[59534]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308572, 'name': CreateVM_Task, 'duration_secs': 0.257111} completed successfully. 
{{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 718.911734] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Created VM on the ESX host {{(pid=59534) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 718.912118] env[59534]: DEBUG nova.compute.provider_tree [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.913685] env[59534]: DEBUG oslo_concurrency.lockutils [None req-d5c690c5-8615-45b1-bbc9-dde29baf475c tempest-ServerActionsTestJSON-634471831 tempest-ServerActionsTestJSON-634471831-project-member] Lock "68f534dd-7119-48f4-85ae-14bfdf68d486" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.671s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.913906] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.914213] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.914416] env[59534]: DEBUG oslo_concurrency.lockutils [None 
req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 718.914846] env[59534]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-090cb072-6629-4146-83a8-9fed647b36a8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.920704] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){ [ 718.920704] env[59534]: value = "session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52a7d72f-388c-4076-da6e-8c82618ad4ff" [ 718.920704] env[59534]: _type = "Task" [ 718.920704] env[59534]: } to complete. 
{{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.925169] env[59534]: DEBUG nova.scheduler.client.report [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.932427] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52a7d72f-388c-4076-da6e-8c82618ad4ff, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 718.939501] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.940134] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 718.980294] env[59534]: DEBUG nova.compute.utils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 718.981747] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 718.981910] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 718.992565] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 719.071243] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 719.103715] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 719.104525] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 719.104525] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 719.104525] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Flavor pref 0:0:0 {{(pid=59534) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 719.104525] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 719.104731] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 719.104760] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 719.105342] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 719.105342] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 719.105342] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 719.105481] env[59534]: DEBUG nova.virt.hardware [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 719.106705] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20886d3b-7868-4b9f-8913-c1b255317085 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.115863] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1475c257-410a-41f4-bbf7-b193484197e6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.432651] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.432923] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Processing image ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 719.433549] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock 
"[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.469134] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.484147] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Releasing lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.484147] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 719.485665] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 719.485665] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-25b8d3a5-2183-472b-959f-0767e0e3fc6a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.490073] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.503797] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a240693-e922-4e33-80d6-cdabdd6903cd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.516229] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Releasing lock "refresh_cache-8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.516421] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Start destroying the instance on the 
hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 719.516758] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 719.517524] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7bc2e29f-85c6-453d-a486-8201cfd03ca8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.523649] env[59534]: DEBUG nova.policy [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1de811c49e4475e8bebd9420cb053e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f07a595f0d54471b9b09e9b1b9b0b5a5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 719.532708] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fcdac100-e2fe-434e-87ca-0a174ecfb0e6 could not be found. 
[ 719.533140] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 719.533334] env[59534]: INFO nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Took 0.05 seconds to destroy the instance on the hypervisor. [ 719.534027] env[59534]: DEBUG oslo.service.loopingcall [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 719.534239] env[59534]: DEBUG nova.compute.manager [-] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 719.534239] env[59534]: DEBUG nova.network.neutron [-] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 719.540263] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7301ab6-5677-4445-b853-0ed2c31ef252 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.568068] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 
8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb could not be found. [ 719.568068] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 719.568169] env[59534]: INFO nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Took 0.05 seconds to destroy the instance on the hypervisor. [ 719.568408] env[59534]: DEBUG oslo.service.loopingcall [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 719.568617] env[59534]: DEBUG nova.compute.manager [-] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 719.568707] env[59534]: DEBUG nova.network.neutron [-] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 719.659762] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Successfully created port: 3feb5ed9-5faf-43db-b885-5f6af87fd8eb {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 719.667295] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Successfully created port: b536cc08-98fb-4ed0-969d-32accca2b5e2 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 719.667295] env[59534]: DEBUG nova.network.neutron [-] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.678060] env[59534]: DEBUG nova.network.neutron [-] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.690114] env[59534]: INFO nova.compute.manager [-] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Took 0.12 seconds to deallocate network for instance. 
[ 719.697830] env[59534]: DEBUG nova.compute.claims [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 719.697830] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.697830] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.910487] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bdcb842-6c47-4219-8598-0047d3898484 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.919602] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86133e3f-1dae-44fb-a756-916add157596 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.956642] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f351ff4-2466-49cc-95b1-12ffd9c68ff2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.965468] env[59534]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-463b21e3-6f37-42dd-8f2c-99da6d4f96c7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.982574] env[59534]: DEBUG nova.compute.provider_tree [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 719.994665] env[59534]: DEBUG nova.scheduler.client.report [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.020368] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.320s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.020368] env[59534]: ERROR nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 
tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Traceback (most recent call last): [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self.driver.spawn(context, instance, image_meta, [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 720.020368] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] vm_ref = self.build_virtual_machine(instance, [ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] vif_infos = vmwarevif.get_vif_info(self._session, [ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 720.020752] env[59534]: ERROR 
nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] for vif in network_info:
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return self._sync_wrapper(fn, *args, **kwargs)
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self.wait()
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self[:] = self._gt.wait()
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return self._exit_event.wait()
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 720.020752] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] result = hub.switch()
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return self.greenlet.switch()
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] result = function(*args, **kwargs)
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] return func(*args, **kwargs)
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] raise e
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] nwinfo = self.network_api.allocate_for_instance(
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] created_port_ids = self._update_ports_for_instance(
[ 720.021367] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] with excutils.save_and_reraise_exception():
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] self.force_reraise()
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] raise self.value
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] updated_port = self._update_port(
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] _ensure_no_port_binding_failure(port)
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] raise exception.PortBindingFailed(port_id=port['id'])
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] nova.exception.PortBindingFailed: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information.
[ 720.021766] env[59534]: ERROR nova.compute.manager [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb]
[ 720.022085] env[59534]: DEBUG nova.compute.utils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 720.025199] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Build of instance 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb was re-scheduled: Binding failed for port 2e9d9d04-ebae-4284-8649-586e770f1bfd, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 720.025199] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 720.025199] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "refresh_cache-8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 720.025199] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquired lock "refresh_cache-8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 720.025443] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 720.065882] env[59534]: DEBUG nova.network.neutron [-] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 720.134252] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 720.203915] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "f4a120d9-3d50-4905-bcac-c6b632fa0295" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 720.204214] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "f4a120d9-3d50-4905-bcac-c6b632fa0295" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 720.217873] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 720.251973] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "de33d275-2cf8-4726-9d1a-526d7556a74f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 720.252205] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "de33d275-2cf8-4726-9d1a-526d7556a74f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 720.270592] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 720.290098] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 720.290098] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 720.291521] env[59534]: INFO nova.compute.claims [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 720.338850] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 720.534987] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64bc0142-d63d-46e0-9284-39f41a10f0aa {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.543060] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7c459e4-3ce3-44f6-868e-ec6e3d9454e8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.572033] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-061d5d30-9995-42ee-a0db-0e98e2b2b3bd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.579659] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ae2a0a-a644-465e-9ff8-bacbb9246956 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.594496] env[59534]: DEBUG nova.compute.provider_tree [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 720.605831] env[59534]: DEBUG nova.scheduler.client.report [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 720.620208] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 720.620686] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 720.623194] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.284s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 720.628020] env[59534]: INFO nova.compute.claims [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 720.697131] env[59534]: DEBUG nova.compute.utils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 720.698402] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 720.699021] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 720.712187] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 720.800099] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 720.829977] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 720.829977] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 720.830169] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 720.830385] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 720.830443] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 720.830572] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 720.830773] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 720.830954] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 720.831093] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 720.831293] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 720.831477] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 720.832435] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ff5e86-d3e5-4084-8999-956f497e0dbb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.847206] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac006d4c-2a27-4044-8a3a-bd48053e6140 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.880276] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1a3c9a3-7077-4937-909e-9da63ca3c8df {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.888730] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-612059d8-a3ce-4af8-8bc9-7769f19cd6d5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.920495] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54fbf88a-e793-485f-a789-018e274ebb3a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.927993] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5003ac0-a956-4500-b601-8b0f87e31dbe {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.933809] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 720.948508] env[59534]: DEBUG nova.compute.provider_tree [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 720.954243] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Releasing lock "refresh_cache-8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 720.954558] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 720.954815] env[59534]: DEBUG nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 720.955083] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 720.965449] env[59534]: DEBUG nova.scheduler.client.report [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 720.981028] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.357s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 720.981113] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 720.992594] env[59534]: DEBUG nova.policy [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00bb3812655e400ea181ea219b316f42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e63775eb8bc34a52a2b1e79624c417e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 721.028245] env[59534]: DEBUG nova.compute.utils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 721.030341] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 721.030582] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 721.034528] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 721.041832] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 721.045847] env[59534]: DEBUG nova.network.neutron [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 721.057700] env[59534]: INFO nova.compute.manager [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb] Took 0.10 seconds to deallocate network for instance.
[ 721.160972] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 721.192681] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 721.192880] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 721.194740] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 721.195038] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 721.195214] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 721.195367] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 721.195583] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 721.195740] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 721.195902] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 721.196075] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 721.196251] env[59534]: DEBUG nova.virt.hardware [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 721.198009] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74939a4f-8f32-429e-9411-af462c5c7f0f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 721.206062] env[59534]: INFO nova.scheduler.client.report [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Deleted allocations for instance 8ebf1e27-6cc8-4122-b8b1-ed0585507bdb
[ 721.220854] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6b8da49-433d-41ce-90b4-256a92182279 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 721.227831] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f871018d-e547-4ba6-b9af-693e9e080160 tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "8ebf1e27-6cc8-4122-b8b1-ed0585507bdb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.912s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 721.307727] env[59534]: DEBUG nova.policy [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00bb3812655e400ea181ea219b316f42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e63775eb8bc34a52a2b1e79624c417e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 721.414024] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "f3febdda-db75-4048-a2b5-aa4daf2dcfa2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 721.414024] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "f3febdda-db75-4048-a2b5-aa4daf2dcfa2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 721.426160] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 721.459328] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Successfully created port: 7a72d80e-8567-4d32-bb2b-d09fce05e08b {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 721.487232] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 721.487464] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 721.491656] env[59534]: INFO nova.compute.claims [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 721.709159] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-996c3d11-5eed-4990-b0b9-d0505e277e24 {{(pid=59534) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.718627] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88faad6d-0602-4ef4-9ac2-d66aa3c47526 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.754661] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac7a8cc3-791f-488c-b439-8f3ba57b78c6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.762832] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97e90a14-e6e2-4f78-a8b5-ac3459cdccdc {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.778247] env[59534]: DEBUG nova.compute.provider_tree [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 721.786995] env[59534]: DEBUG nova.scheduler.client.report [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} 
[ 721.805154] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.805824] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 721.846223] env[59534]: DEBUG nova.compute.utils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 721.847703] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 721.847973] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 721.857159] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 721.941147] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 721.967306] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 721.967588] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 721.967748] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 721.967922] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] 
Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 721.968074] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 721.968939] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 721.968939] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 721.968939] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 721.968939] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 721.968939] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 
tempest-ListServerFiltersTestJSON-712311738-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 721.969188] env[59534]: DEBUG nova.virt.hardware [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 721.969893] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2b6a729-ed5b-41b8-9fcd-0653d33611c8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.984846] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f33253-1051-4f4f-ad12-7c85ae9a1079 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.094928] env[59534]: DEBUG nova.policy [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '159c147d52e0452c95dcb771fcd2309d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02f55d9578bb4104ab4955671dfbff30', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 722.119270] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 
tempest-ServersTestJSON-865559043-project-member] Acquiring lock "3a20b8f8-f10c-4303-b801-e0831da74f91" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.119524] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Lock "3a20b8f8-f10c-4303-b801-e0831da74f91" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.136031] env[59534]: DEBUG nova.compute.manager [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Received event network-changed-938a7aad-7df9-4af1-bd70-4d4ab4f7363b {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 722.136031] env[59534]: DEBUG nova.compute.manager [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Refreshing instance network info cache due to event network-changed-938a7aad-7df9-4af1-bd70-4d4ab4f7363b. 
{{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 722.136031] env[59534]: DEBUG oslo_concurrency.lockutils [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] Acquiring lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.136031] env[59534]: DEBUG oslo_concurrency.lockutils [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] Acquired lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.136031] env[59534]: DEBUG nova.network.neutron [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Refreshing network info cache for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 722.263418] env[59534]: DEBUG nova.network.neutron [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.625978] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Successfully created port: 9baae0bb-f909-4f91-84e5-58b3057040a8 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 722.800177] env[59534]: DEBUG nova.network.neutron [-] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.809204] env[59534]: INFO nova.compute.manager [-] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Took 3.27 seconds to deallocate network for instance. [ 722.811359] env[59534]: DEBUG nova.compute.claims [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 722.811723] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.811992] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.011979] env[59534]: DEBUG nova.network.neutron [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.022637] env[59534]: DEBUG oslo_concurrency.lockutils [req-9cdc05de-de0c-4d59-b100-398fff9ac74a req-cba87ac2-15e5-473d-888e-883a22e4b77c service nova] Releasing lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 723.041647] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44146dc8-541b-4dcd-83a3-ad0071b56f89 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.054781] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb711645-1828-4721-b1c6-ecdc9bc7fa32 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.102999] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9f7d06c-c47d-43a8-9847-ce6e3a5a2a2e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.108038] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb1f2712-86c7-45b1-ad42-1b8fedc0891e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.122066] env[59534]: DEBUG nova.compute.provider_tree [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory 
has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.135388] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Successfully created port: 35600a1d-ee8f-4c43-bc89-58c48eb01c64 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 723.142377] env[59534]: DEBUG nova.scheduler.client.report [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.166873] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.354s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.167764] env[59534]: ERROR nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Failed to build and run 
instance: nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Traceback (most recent call last): [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self.driver.spawn(context, instance, image_meta, [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] vm_ref = self.build_virtual_machine(instance, [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] vif_infos = vmwarevif.get_vif_info(self._session, [ 723.167764] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] for vif in network_info: [ 723.168104] env[59534]: ERROR nova.compute.manager 
[instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return self._sync_wrapper(fn, *args, **kwargs) [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self.wait() [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self[:] = self._gt.wait() [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return self._exit_event.wait() [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] result = hub.switch() [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return self.greenlet.switch() [ 723.168104] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main 
[ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] result = function(*args, **kwargs) [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] return func(*args, **kwargs) [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] raise e [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] nwinfo = self.network_api.allocate_for_instance( [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] created_port_ids = self._update_ports_for_instance( [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] with excutils.save_and_reraise_exception(): [ 723.168586] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.168586] env[59534]: ERROR nova.compute.manager 
[instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] self.force_reraise() [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] raise self.value [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] updated_port = self._update_port( [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] _ensure_no_port_binding_failure(port) [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] raise exception.PortBindingFailed(port_id=port['id']) [ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] nova.exception.PortBindingFailed: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. 
[ 723.169014] env[59534]: ERROR nova.compute.manager [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] [ 723.169014] env[59534]: DEBUG nova.compute.utils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 723.171848] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Build of instance fcdac100-e2fe-434e-87ca-0a174ecfb0e6 was re-scheduled: Binding failed for port 938a7aad-7df9-4af1-bd70-4d4ab4f7363b, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 723.171848] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 723.172158] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.172199] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquired lock 
"refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 723.172443] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 723.257085] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.778754] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.791304] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Releasing lock "refresh_cache-fcdac100-e2fe-434e-87ca-0a174ecfb0e6" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 723.791989] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 723.792502] env[59534]: DEBUG nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 723.792717] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.887512] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.902288] env[59534]: DEBUG nova.network.neutron [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.912655] env[59534]: INFO nova.compute.manager [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: fcdac100-e2fe-434e-87ca-0a174ecfb0e6] Took 0.12 seconds to deallocate network for instance. 
[ 723.930184] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Successfully created port: f10be37c-9278-4524-a459-645ac91b41fa {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 724.014093] env[59534]: INFO nova.scheduler.client.report [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Deleted allocations for instance fcdac100-e2fe-434e-87ca-0a174ecfb0e6 [ 724.033043] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e06fa61f-8129-4316-9346-c7d6ce47d124 tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "fcdac100-e2fe-434e-87ca-0a174ecfb0e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 26.698s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.055602] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 724.111557] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.111889] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.113536] env[59534]: INFO nova.compute.claims [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 724.361955] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bde122ad-77dd-46e7-9925-b37dc1afa4c2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.370672] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-385ccb4c-ef01-4874-953f-aaf0479f459b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.406866] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb16c635-0726-4bed-a4a0-9b8e8a5702cf {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.415589] env[59534]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15328e12-fa4c-4dff-9a77-bed9bb821e0d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.430360] env[59534]: DEBUG nova.compute.provider_tree [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.442386] env[59534]: DEBUG nova.scheduler.client.report [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 724.464600] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.464600] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Start building networks 
asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 724.501551] env[59534]: DEBUG nova.compute.utils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 724.503177] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 724.503355] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 724.512585] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 724.595437] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 724.620222] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 724.620579] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 724.620732] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 724.620903] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 724.621056] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 724.621498] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 724.621498] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 724.621577] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 724.622171] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 724.622171] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 724.622171] env[59534]: DEBUG nova.virt.hardware [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 724.623026] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a017ad9-16c3-4b1f-9414-7a527d8b21cf {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.631475] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f591d63-c9d3-4c07-95b6-7664337e774c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.738902] env[59534]: DEBUG nova.policy [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a9465d6fc7b451d9fca4dc4a011d30c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1b7274f48f0a4f9cb467ccb4573b1052', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 725.075589] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquiring lock "e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
{{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.076134] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Lock "e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.395048] env[59534]: ERROR nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. [ 725.395048] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 725.395048] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 725.395048] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 725.395048] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 725.395048] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 725.395048] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 725.395048] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 725.395048] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 725.395048] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 
725.395048] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 725.395048] env[59534]: ERROR nova.compute.manager raise self.value [ 725.395048] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 725.395048] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 725.395048] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 725.395048] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 725.395548] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 725.395548] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 725.395548] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. 
[ 725.395548] env[59534]: ERROR nova.compute.manager [ 725.395548] env[59534]: Traceback (most recent call last): [ 725.395548] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 725.395548] env[59534]: listener.cb(fileno) [ 725.395548] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 725.395548] env[59534]: result = function(*args, **kwargs) [ 725.395548] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 725.395548] env[59534]: return func(*args, **kwargs) [ 725.395548] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 725.395548] env[59534]: raise e [ 725.395548] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 725.395548] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 725.395548] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 725.395548] env[59534]: created_port_ids = self._update_ports_for_instance( [ 725.395548] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 725.395548] env[59534]: with excutils.save_and_reraise_exception(): [ 725.395548] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 725.395548] env[59534]: self.force_reraise() [ 725.395548] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 725.395548] env[59534]: raise self.value [ 725.395548] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 725.395548] env[59534]: updated_port = self._update_port( [ 725.395548] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 725.395548] env[59534]: _ensure_no_port_binding_failure(port) [ 725.395548] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 725.395548] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 725.396430] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. [ 725.396430] env[59534]: Removing descriptor: 19 [ 725.396430] env[59534]: ERROR nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Traceback (most recent call last): [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] yield resources [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self.driver.spawn(context, instance, image_meta, [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 725.396430] env[59534]: ERROR nova.compute.manager 
[instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 725.396430] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] vm_ref = self.build_virtual_machine(instance, [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] vif_infos = vmwarevif.get_vif_info(self._session, [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] for vif in network_info: [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return self._sync_wrapper(fn, *args, **kwargs) [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self.wait() [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self[:] = self._gt.wait() [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 
181, in wait [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return self._exit_event.wait() [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 725.396788] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] result = hub.switch() [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return self.greenlet.switch() [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] result = function(*args, **kwargs) [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return func(*args, **kwargs) [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] raise e [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] 
nwinfo = self.network_api.allocate_for_instance( [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] created_port_ids = self._update_ports_for_instance( [ 725.397620] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] with excutils.save_and_reraise_exception(): [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self.force_reraise() [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] raise self.value [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] updated_port = self._update_port( [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] 
_ensure_no_port_binding_failure(port) [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] raise exception.PortBindingFailed(port_id=port['id']) [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. [ 725.398032] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] [ 725.398432] env[59534]: INFO nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Terminating instance [ 725.399276] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquiring lock "refresh_cache-9ef14f08-3ae3-48cb-ae99-2b2731faeeab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 725.399435] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquired lock "refresh_cache-9ef14f08-3ae3-48cb-ae99-2b2731faeeab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 725.399590] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] 
Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 725.473168] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 725.781912] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 725.792955] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Releasing lock "refresh_cache-9ef14f08-3ae3-48cb-ae99-2b2731faeeab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 725.793427] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 725.793622] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 725.794145] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-78def77e-fd97-4c99-bc66-78356d95e593 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.803891] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5821e8b0-642b-4322-a5fe-d25ce2331773 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.827463] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9ef14f08-3ae3-48cb-ae99-2b2731faeeab could not be found. [ 725.827677] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 725.827854] env[59534]: INFO nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Took 0.03 seconds to destroy the instance on the hypervisor. 
[ 725.828102] env[59534]: DEBUG oslo.service.loopingcall [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 725.828322] env[59534]: DEBUG nova.compute.manager [-] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 725.828417] env[59534]: DEBUG nova.network.neutron [-] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 725.869403] env[59534]: DEBUG nova.network.neutron [-] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 725.877887] env[59534]: DEBUG nova.network.neutron [-] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 725.887760] env[59534]: INFO nova.compute.manager [-] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Took 0.06 seconds to deallocate network for instance. 
[ 725.890523] env[59534]: DEBUG nova.compute.claims [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 725.890703] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.890913] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.093718] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb06717-cc79-46a9-bb8c-4614d0bdbb3e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.102407] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a61efc3f-b4c3-4988-aaaa-e8594533b00e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.138127] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5935dee-a08e-4789-9b23-8ccdd282147a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.146661] env[59534]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-686176e9-7b05-46b8-9e55-5a66f6e1e103 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.163814] env[59534]: DEBUG nova.compute.provider_tree [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 726.175070] env[59534]: DEBUG nova.scheduler.client.report [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 726.194436] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.195256] env[59534]: ERROR nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 
9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Traceback (most recent call last): [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self.driver.spawn(context, instance, image_meta, [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] vm_ref = self.build_virtual_machine(instance, [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.195256] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] for vif in 
network_info: [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return self._sync_wrapper(fn, *args, **kwargs) [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self.wait() [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self[:] = self._gt.wait() [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return self._exit_event.wait() [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] result = hub.switch() [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return self.greenlet.switch() [ 726.195598] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] result = function(*args, **kwargs) [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] return func(*args, **kwargs) [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] raise e [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] nwinfo = self.network_api.allocate_for_instance( [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] created_port_ids = self._update_ports_for_instance( [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] with excutils.save_and_reraise_exception(): [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.196867] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] self.force_reraise() [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] raise self.value [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] updated_port = self._update_port( [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] _ensure_no_port_binding_failure(port) [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] raise exception.PortBindingFailed(port_id=port['id']) [ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] nova.exception.PortBindingFailed: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. 
[ 726.197240] env[59534]: ERROR nova.compute.manager [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] [ 726.198312] env[59534]: DEBUG nova.compute.utils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 726.200153] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Build of instance 9ef14f08-3ae3-48cb-ae99-2b2731faeeab was re-scheduled: Binding failed for port 3feb5ed9-5faf-43db-b885-5f6af87fd8eb, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 726.200572] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 726.200847] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquiring lock "refresh_cache-9ef14f08-3ae3-48cb-ae99-2b2731faeeab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 726.200927] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 
tempest-SecurityGroupsTestJSON-1311240703-project-member] Acquired lock "refresh_cache-9ef14f08-3ae3-48cb-ae99-2b2731faeeab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 726.201510] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 726.239896] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.279706] env[59534]: ERROR nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. 
[ 726.279706] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 726.279706] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.279706] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 726.279706] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.279706] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 726.279706] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.279706] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 726.279706] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.279706] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 726.279706] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.279706] env[59534]: ERROR nova.compute.manager raise self.value [ 726.279706] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.279706] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 726.279706] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.279706] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 726.281101] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.281101] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 726.281101] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. [ 726.281101] env[59534]: ERROR nova.compute.manager [ 726.281101] env[59534]: Traceback (most recent call last): [ 726.281101] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 726.281101] env[59534]: listener.cb(fileno) [ 726.281101] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.281101] env[59534]: result = function(*args, **kwargs) [ 726.281101] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.281101] env[59534]: return func(*args, **kwargs) [ 726.281101] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.281101] env[59534]: raise e [ 726.281101] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.281101] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 726.281101] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.281101] env[59534]: created_port_ids = self._update_ports_for_instance( [ 726.281101] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.281101] env[59534]: with excutils.save_and_reraise_exception(): [ 726.281101] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.281101] env[59534]: self.force_reraise() [ 726.281101] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.281101] env[59534]: raise self.value [ 726.281101] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.281101] env[59534]: updated_port = self._update_port( [ 726.281101] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.281101] env[59534]: _ensure_no_port_binding_failure(port) [ 726.281101] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.281101] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 726.282786] env[59534]: nova.exception.PortBindingFailed: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. [ 726.282786] env[59534]: Removing descriptor: 20 [ 726.282786] env[59534]: ERROR nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Traceback (most recent call last): [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] yield resources [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self.driver.spawn(context, instance, image_meta, [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 726.282786] env[59534]: ERROR nova.compute.manager 
[instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.282786] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] vm_ref = self.build_virtual_machine(instance, [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] for vif in network_info: [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return self._sync_wrapper(fn, *args, **kwargs) [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self.wait() [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self[:] = self._gt.wait() [ 726.283431] 
env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return self._exit_event.wait() [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 726.283431] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] result = hub.switch() [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return self.greenlet.switch() [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] result = function(*args, **kwargs) [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return func(*args, **kwargs) [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] raise e [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] nwinfo = self.network_api.allocate_for_instance( [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] created_port_ids = self._update_ports_for_instance( [ 726.285201] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] with excutils.save_and_reraise_exception(): [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self.force_reraise() [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] raise self.value [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] updated_port = self._update_port( [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] _ensure_no_port_binding_failure(port) [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] raise exception.PortBindingFailed(port_id=port['id']) [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] nova.exception.PortBindingFailed: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. [ 726.285930] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] [ 726.287388] env[59534]: INFO nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Terminating instance [ 726.287388] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "refresh_cache-880bddac-daec-4194-8e5d-e7aaba8c2dd1" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 726.287388] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquired lock "refresh_cache-880bddac-daec-4194-8e5d-e7aaba8c2dd1" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 726.287388] env[59534]: DEBUG nova.network.neutron [None 
req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 726.355030] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.361814] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquiring lock "c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.362139] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Lock "c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.435299] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Successfully created port: 7ee485bb-e516-4317-bb2a-05b5ab3d6574 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 
726.479654] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.494691] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Releasing lock "refresh_cache-9ef14f08-3ae3-48cb-ae99-2b2731faeeab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 726.494941] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 726.495282] env[59534]: DEBUG nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 726.495506] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 726.541069] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.549415] env[59534]: DEBUG nova.network.neutron [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.560121] env[59534]: INFO nova.compute.manager [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] [instance: 9ef14f08-3ae3-48cb-ae99-2b2731faeeab] Took 0.06 seconds to deallocate network for instance. 
[ 726.689239] env[59534]: INFO nova.scheduler.client.report [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Deleted allocations for instance 9ef14f08-3ae3-48cb-ae99-2b2731faeeab [ 726.714288] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5fef9c7b-f2b7-4e82-9bf2-4ab1e3cd5b76 tempest-SecurityGroupsTestJSON-1311240703 tempest-SecurityGroupsTestJSON-1311240703-project-member] Lock "9ef14f08-3ae3-48cb-ae99-2b2731faeeab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.205s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.720163] env[59534]: ERROR nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. 
[ 726.720163] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 726.720163] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.720163] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 726.720163] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.720163] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 726.720163] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.720163] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 726.720163] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.720163] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 726.720163] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.720163] env[59534]: ERROR nova.compute.manager raise self.value [ 726.720163] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.720163] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 726.720163] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.720163] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 726.720602] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.720602] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 726.720602] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. [ 726.720602] env[59534]: ERROR nova.compute.manager [ 726.720602] env[59534]: Traceback (most recent call last): [ 726.720602] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 726.720602] env[59534]: listener.cb(fileno) [ 726.720602] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.720602] env[59534]: result = function(*args, **kwargs) [ 726.720602] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.720602] env[59534]: return func(*args, **kwargs) [ 726.720602] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.720602] env[59534]: raise e [ 726.720602] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.720602] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 726.720602] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.720602] env[59534]: created_port_ids = self._update_ports_for_instance( [ 726.720602] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.720602] env[59534]: with excutils.save_and_reraise_exception(): [ 726.720602] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.720602] env[59534]: self.force_reraise() [ 726.720602] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.720602] env[59534]: raise self.value [ 726.720602] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.720602] env[59534]: updated_port = self._update_port( [ 726.720602] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.720602] env[59534]: _ensure_no_port_binding_failure(port) [ 726.720602] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.720602] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 726.721342] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. [ 726.721342] env[59534]: Removing descriptor: 15 [ 726.722303] env[59534]: ERROR nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Traceback (most recent call last): [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] yield resources [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self.driver.spawn(context, instance, image_meta, [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 726.722303] env[59534]: ERROR nova.compute.manager 
[instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] vm_ref = self.build_virtual_machine(instance, [ 726.722303] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] for vif in network_info: [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return self._sync_wrapper(fn, *args, **kwargs) [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self.wait() [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self[:] = self._gt.wait() [ 726.722654] 
env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return self._exit_event.wait() [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] result = hub.switch() [ 726.722654] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return self.greenlet.switch() [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] result = function(*args, **kwargs) [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return func(*args, **kwargs) [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] raise e [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] nwinfo = self.network_api.allocate_for_instance( [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] created_port_ids = self._update_ports_for_instance( [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.723019] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] with excutils.save_and_reraise_exception(): [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self.force_reraise() [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] raise self.value [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] updated_port = self._update_port( [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] _ensure_no_port_binding_failure(port) [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] raise exception.PortBindingFailed(port_id=port['id']) [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. [ 726.723469] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] [ 726.723761] env[59534]: INFO nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Terminating instance [ 726.725885] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-de33d275-2cf8-4726-9d1a-526d7556a74f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 726.726104] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-de33d275-2cf8-4726-9d1a-526d7556a74f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 726.726307] env[59534]: DEBUG nova.network.neutron [None 
req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 726.745953] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 726.750245] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.758257] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Releasing lock "refresh_cache-880bddac-daec-4194-8e5d-e7aaba8c2dd1" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 726.762041] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 726.762041] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 726.762041] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-371c79ce-c5be-4924-a87a-0602c0681e53 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.769874] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0be8cc3a-af4a-4b4f-9537-34fbd22d2607 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.789126] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.800633] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 880bddac-daec-4194-8e5d-e7aaba8c2dd1 could not be found. 
[ 726.801116] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 726.801390] env[59534]: INFO nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Took 0.04 seconds to destroy the instance on the hypervisor. [ 726.801648] env[59534]: DEBUG oslo.service.loopingcall [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 726.804472] env[59534]: DEBUG nova.compute.manager [-] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 726.804853] env[59534]: DEBUG nova.network.neutron [-] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 726.828821] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.828821] env[59534]: DEBUG oslo_concurrency.lockutils 
[None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.828821] env[59534]: INFO nova.compute.claims [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 726.850013] env[59534]: DEBUG nova.network.neutron [-] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.859581] env[59534]: DEBUG nova.network.neutron [-] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.876235] env[59534]: INFO nova.compute.manager [-] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Took 0.07 seconds to deallocate network for instance. 
[ 726.878495] env[59534]: DEBUG nova.compute.claims [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 726.878669] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.945209] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquiring lock "cbd11b11-3927-4f2f-81c5-9e282f432273" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.945209] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Lock "cbd11b11-3927-4f2f-81c5-9e282f432273" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.989384] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Updating instance_info_cache with network_info: [] {{(pid=59534) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.998288] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-de33d275-2cf8-4726-9d1a-526d7556a74f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 726.998788] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 726.998896] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 727.001797] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-20fa0154-ebd1-4160-8aef-a6365b26df62 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.011944] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e0b851a-af16-4592-a7f5-c3a8cfac5fc0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.039026] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance does not exist on backend: 
nova.exception.InstanceNotFound: Instance de33d275-2cf8-4726-9d1a-526d7556a74f could not be found. [ 727.039305] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 727.039448] env[59534]: INFO nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 727.039687] env[59534]: DEBUG oslo.service.loopingcall [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 727.042424] env[59534]: DEBUG nova.compute.manager [-] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 727.042526] env[59534]: DEBUG nova.network.neutron [-] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 727.064445] env[59534]: DEBUG nova.network.neutron [-] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.081019] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf5fe17c-68c7-4a60-9970-0035c1cd737a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.084272] env[59534]: DEBUG nova.network.neutron [-] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.094681] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a4b28ed-db02-44f2-bef2-6beaffa88438 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.128442] env[59534]: INFO nova.compute.manager [-] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Took 0.09 seconds to deallocate network for instance. 
[ 727.129191] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b207b158-181f-4fb3-84f9-4bf577a9f1db {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.135860] env[59534]: DEBUG nova.compute.claims [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 727.135860] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.139065] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f621010e-15f9-4683-97f3-1c7ce55e8d04 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.152929] env[59534]: DEBUG nova.compute.provider_tree [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.163666] env[59534]: DEBUG nova.scheduler.client.report [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.179223] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.353s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.179834] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 727.182613] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.304s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.222923] env[59534]: DEBUG nova.compute.utils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 727.225154] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 727.225331] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 727.235408] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 727.308736] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 727.336732] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 727.336894] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 727.336988] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 
tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 727.337193] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 727.337341] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 727.338366] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 727.338366] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 727.338366] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 727.338366] env[59534]: DEBUG nova.virt.hardware 
[None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 727.338585] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 727.338585] env[59534]: DEBUG nova.virt.hardware [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 727.339457] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-179d9d5c-57e1-4456-9d8f-a8572bc2f032 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.351014] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-663c5df9-7a1a-40fd-b4d7-ca7a2a6236e9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.409635] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e74b7bb-0f3c-4791-89a5-8ae88ce1a6a2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.416464] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05d2ae92-58f5-4f6e-ad28-d5e21704a90f {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.447929] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d832f76-73c0-4eaf-8a4e-63cad4f7e8b6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.454639] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4361a2cd-8012-4bfb-b00f-a98c72225af6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.467687] env[59534]: DEBUG nova.compute.provider_tree [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.480140] env[59534]: DEBUG nova.scheduler.client.report [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.494020] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.311s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.494623] env[59534]: ERROR nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Traceback (most recent call last): [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self.driver.spawn(context, instance, image_meta, [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] vm_ref = self.build_virtual_machine(instance, [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.494623] env[59534]: ERROR nova.compute.manager 
[instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.494623] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] for vif in network_info: [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return self._sync_wrapper(fn, *args, **kwargs) [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self.wait() [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self[:] = self._gt.wait() [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return self._exit_event.wait() [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] result = hub.switch() [ 727.494971] env[59534]: ERROR nova.compute.manager 
[instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return self.greenlet.switch() [ 727.494971] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] result = function(*args, **kwargs) [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] return func(*args, **kwargs) [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] raise e [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] nwinfo = self.network_api.allocate_for_instance( [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] created_port_ids = self._update_ports_for_instance( [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] with excutils.save_and_reraise_exception(): [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.495364] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] self.force_reraise() [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] raise self.value [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] updated_port = self._update_port( [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] _ensure_no_port_binding_failure(port) [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] raise exception.PortBindingFailed(port_id=port['id']) [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] nova.exception.PortBindingFailed: 
Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. [ 727.495668] env[59534]: ERROR nova.compute.manager [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] [ 727.495668] env[59534]: DEBUG nova.compute.utils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 727.496436] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.362s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.499176] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Build of instance 880bddac-daec-4194-8e5d-e7aaba8c2dd1 was re-scheduled: Binding failed for port b536cc08-98fb-4ed0-969d-32accca2b5e2, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 727.499624] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 727.499842] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "refresh_cache-880bddac-daec-4194-8e5d-e7aaba8c2dd1" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.499984] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquired lock "refresh_cache-880bddac-daec-4194-8e5d-e7aaba8c2dd1" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.500158] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.506022] env[59534]: DEBUG nova.policy [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9f00996f20fc4843a60d040f1a4818c6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'a71c30f298e84c1abc9921bbe550773e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 727.558595] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.696723] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f7e3719-7dc2-46c1-9cb9-fedf769793bc {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.707433] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95537ce4-1524-4a98-99c4-133eb74e0d13 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.739472] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8915958-9fe6-4ef7-88d0-b06f654b8d49 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.747348] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d0da5fd-75ae-4a89-ad6e-31497a8e8268 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.761461] env[59534]: DEBUG nova.compute.provider_tree [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree 
for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.769909] env[59534]: DEBUG nova.scheduler.client.report [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.793573] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.297s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.794268] env[59534]: ERROR nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. 
[ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Traceback (most recent call last): [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self.driver.spawn(context, instance, image_meta, [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] vm_ref = self.build_virtual_machine(instance, [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.794268] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] for vif in network_info: [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.794550] env[59534]: ERROR 
nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return self._sync_wrapper(fn, *args, **kwargs) [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self.wait() [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self[:] = self._gt.wait() [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return self._exit_event.wait() [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] result = hub.switch() [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return self.greenlet.switch() [ 727.794550] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] result = function(*args, **kwargs) [ 
727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] return func(*args, **kwargs) [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] raise e [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] nwinfo = self.network_api.allocate_for_instance( [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] created_port_ids = self._update_ports_for_instance( [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] with excutils.save_and_reraise_exception(): [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.794989] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] self.force_reraise() [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: 
de33d275-2cf8-4726-9d1a-526d7556a74f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] raise self.value [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] updated_port = self._update_port( [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] _ensure_no_port_binding_failure(port) [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] raise exception.PortBindingFailed(port_id=port['id']) [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] nova.exception.PortBindingFailed: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. [ 727.795339] env[59534]: ERROR nova.compute.manager [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] [ 727.795339] env[59534]: DEBUG nova.compute.utils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 727.796830] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Build of instance de33d275-2cf8-4726-9d1a-526d7556a74f was re-scheduled: Binding failed for port 35600a1d-ee8f-4c43-bc89-58c48eb01c64, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 727.797262] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 727.797487] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-de33d275-2cf8-4726-9d1a-526d7556a74f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.797631] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-de33d275-2cf8-4726-9d1a-526d7556a74f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.797786] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Building network info cache for instance 
{{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.836010] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.933547] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "55002608-7ede-4f13-a820-09f8b7da3edf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.933964] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "55002608-7ede-4f13-a820-09f8b7da3edf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.994847] env[59534]: ERROR nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. 
[ 727.994847] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 727.994847] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.994847] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 727.994847] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.994847] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 727.994847] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.994847] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 727.994847] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.994847] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 727.994847] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.994847] env[59534]: ERROR nova.compute.manager raise self.value [ 727.994847] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.994847] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 727.994847] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.994847] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 727.995384] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.995384] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 727.995384] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. [ 727.995384] env[59534]: ERROR nova.compute.manager [ 727.995384] env[59534]: Traceback (most recent call last): [ 727.995384] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 727.995384] env[59534]: listener.cb(fileno) [ 727.995384] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.995384] env[59534]: result = function(*args, **kwargs) [ 727.995384] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.995384] env[59534]: return func(*args, **kwargs) [ 727.995384] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.995384] env[59534]: raise e [ 727.995384] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.995384] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 727.995384] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.995384] env[59534]: created_port_ids = self._update_ports_for_instance( [ 727.995384] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.995384] env[59534]: with excutils.save_and_reraise_exception(): [ 727.995384] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.995384] env[59534]: self.force_reraise() [ 727.995384] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.995384] env[59534]: raise self.value [ 727.995384] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.995384] env[59534]: updated_port = self._update_port( [ 727.995384] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.995384] env[59534]: _ensure_no_port_binding_failure(port) [ 727.995384] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.995384] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 727.996081] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. [ 727.996081] env[59534]: Removing descriptor: 17 [ 727.996081] env[59534]: ERROR nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Traceback (most recent call last): [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] yield resources [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self.driver.spawn(context, instance, image_meta, [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: 
f1e315bf-9348-4631-af77-85ec3a986a83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.996081] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] vm_ref = self.build_virtual_machine(instance, [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] for vif in network_info: [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return self._sync_wrapper(fn, *args, **kwargs) [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self.wait() [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self[:] = self._gt.wait() [ 727.996428] 
env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return self._exit_event.wait() [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.996428] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] result = hub.switch() [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return self.greenlet.switch() [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] result = function(*args, **kwargs) [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return func(*args, **kwargs) [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] raise e [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] nwinfo = self.network_api.allocate_for_instance( [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] created_port_ids = self._update_ports_for_instance( [ 727.996762] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] with excutils.save_and_reraise_exception(): [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self.force_reraise() [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] raise self.value [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] updated_port = self._update_port( [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] _ensure_no_port_binding_failure(port) [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] raise exception.PortBindingFailed(port_id=port['id']) [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. [ 727.997354] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] [ 727.997668] env[59534]: INFO nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Terminating instance [ 727.998938] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-f1e315bf-9348-4631-af77-85ec3a986a83" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.998938] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-f1e315bf-9348-4631-af77-85ec3a986a83" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.999074] env[59534]: DEBUG nova.network.neutron [None 
req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.016917] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.026724] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Releasing lock "refresh_cache-880bddac-daec-4194-8e5d-e7aaba8c2dd1" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.026930] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 728.028373] env[59534]: DEBUG nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 728.028373] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.054857] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.081635] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.088500] env[59534]: DEBUG nova.network.neutron [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.098084] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.101431] env[59534]: INFO nova.compute.manager [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: 880bddac-daec-4194-8e5d-e7aaba8c2dd1] Took 0.07 seconds to deallocate network for instance. [ 728.110841] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-de33d275-2cf8-4726-9d1a-526d7556a74f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.111067] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 728.111247] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 728.111403] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.113707] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Successfully created port: 8309cb18-2e4f-4e04-b760-280bae870627 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 728.155729] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.165393] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.174940] env[59534]: INFO nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: de33d275-2cf8-4726-9d1a-526d7556a74f] Took 0.06 seconds to deallocate network for instance. [ 728.208134] env[59534]: INFO nova.scheduler.client.report [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Deleted allocations for instance 880bddac-daec-4194-8e5d-e7aaba8c2dd1 [ 728.225876] env[59534]: DEBUG oslo_concurrency.lockutils [None req-4155c5aa-fa12-458e-8d2f-45c321b48d0f tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "880bddac-daec-4194-8e5d-e7aaba8c2dd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.148s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.238241] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 728.316024] env[59534]: INFO nova.scheduler.client.report [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Deleted allocations for instance de33d275-2cf8-4726-9d1a-526d7556a74f [ 728.338722] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "de33d275-2cf8-4726-9d1a-526d7556a74f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.085s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.347361] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.347361] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.347361] env[59534]: INFO nova.compute.claims [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 728.355991] env[59534]: DEBUG 
nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 728.423295] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.439861] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquiring lock "57165716-986f-4582-ae73-e60fde250240" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.440270] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Lock "57165716-986f-4582-ae73-e60fde250240" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.573964] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-310912b7-460f-4e3d-a94f-5cc0d6d010bf {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.583886] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23e2ef10-740b-44ab-a8c0-c03f2f318536 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.617427] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.619627] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7033930c-13bd-4b87-9fb8-db0b07a10298 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.630167] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5840920-a13d-4b6b-b4e4-cb86882845de {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.635877] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-f1e315bf-9348-4631-af77-85ec3a986a83" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.636801] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 728.638936] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 728.638936] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-90c95cf8-24da-469d-a058-f89cb8b6d01e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.652899] env[59534]: DEBUG nova.compute.provider_tree [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 728.661471] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7646c8bf-5339-43c2-953b-e1b728d85cb6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.674811] env[59534]: DEBUG nova.scheduler.client.report [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 728.691211] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f1e315bf-9348-4631-af77-85ec3a986a83 could not be found. [ 728.691437] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 728.691610] env[59534]: INFO nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Took 0.05 seconds to destroy the instance on the hypervisor. [ 728.691848] env[59534]: DEBUG oslo.service.loopingcall [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 728.692110] env[59534]: DEBUG nova.compute.manager [-] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 728.692186] env[59534]: DEBUG nova.network.neutron [-] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.695647] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.696115] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 728.698911] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.276s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.700379] env[59534]: INFO nova.compute.claims [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 728.735839] env[59534]: DEBUG nova.compute.utils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 728.738783] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 728.738783] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 728.742936] env[59534]: DEBUG nova.network.neutron [-] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.749160] env[59534]: DEBUG nova.network.neutron [-] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.758632] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 728.762255] env[59534]: INFO nova.compute.manager [-] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Took 0.07 seconds to deallocate network for instance. [ 728.764233] env[59534]: DEBUG nova.compute.claims [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 728.764963] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.809389] env[59534]: ERROR nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. 
[ 728.809389] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 728.809389] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.809389] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 728.809389] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.809389] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 728.809389] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.809389] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 728.809389] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.809389] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 728.809389] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.809389] env[59534]: ERROR nova.compute.manager raise self.value [ 728.809389] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.809389] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 728.809389] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.809389] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 728.809850] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.809850] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 728.809850] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. [ 728.809850] env[59534]: ERROR nova.compute.manager [ 728.809850] env[59534]: Traceback (most recent call last): [ 728.809850] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 728.809850] env[59534]: listener.cb(fileno) [ 728.809850] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.809850] env[59534]: result = function(*args, **kwargs) [ 728.809850] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.809850] env[59534]: return func(*args, **kwargs) [ 728.809850] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.809850] env[59534]: raise e [ 728.809850] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.809850] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 728.809850] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.809850] env[59534]: created_port_ids = self._update_ports_for_instance( [ 728.809850] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.809850] env[59534]: with excutils.save_and_reraise_exception(): [ 728.809850] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.809850] env[59534]: self.force_reraise() [ 728.809850] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.809850] env[59534]: raise self.value [ 728.809850] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.809850] env[59534]: updated_port = self._update_port( [ 728.809850] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.809850] env[59534]: _ensure_no_port_binding_failure(port) [ 728.809850] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.809850] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 728.810562] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. [ 728.810562] env[59534]: Removing descriptor: 18 [ 728.810562] env[59534]: ERROR nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Traceback (most recent call last): [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] yield resources [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self.driver.spawn(context, instance, image_meta, [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.810562] env[59534]: ERROR nova.compute.manager 
[instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 728.810562] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] vm_ref = self.build_virtual_machine(instance, [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] vif_infos = vmwarevif.get_vif_info(self._session, [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] for vif in network_info: [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return self._sync_wrapper(fn, *args, **kwargs) [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self.wait() [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self[:] = self._gt.wait() [ 728.810892] 
env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return self._exit_event.wait() [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.810892] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] result = hub.switch() [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return self.greenlet.switch() [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] result = function(*args, **kwargs) [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return func(*args, **kwargs) [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] raise e [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] nwinfo = self.network_api.allocate_for_instance( [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] created_port_ids = self._update_ports_for_instance( [ 728.811234] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] with excutils.save_and_reraise_exception(): [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self.force_reraise() [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] raise self.value [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] updated_port = self._update_port( [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] _ensure_no_port_binding_failure(port) [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] raise exception.PortBindingFailed(port_id=port['id']) [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. [ 728.811535] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] [ 728.811836] env[59534]: INFO nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Terminating instance [ 728.811836] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-f4a120d9-3d50-4905-bcac-c6b632fa0295" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.811836] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-f4a120d9-3d50-4905-bcac-c6b632fa0295" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.811836] env[59534]: DEBUG nova.network.neutron [None 
req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.840139] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 728.852623] env[59534]: DEBUG nova.policy [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7164050950444d6894f746057f08b8b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '887c3104a0114851982023ae4f923f34', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 728.859529] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.868037] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 728.868291] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 728.868448] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 728.869729] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Flavor pref 0:0:0 {{(pid=59534) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 728.869729] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 728.869729] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 728.869729] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 728.869729] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 728.869971] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 728.869971] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 728.870062] env[59534]: DEBUG nova.virt.hardware [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 728.871600] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9648e7c5-3548-4275-89a0-e41ec03c1a25 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.881438] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5094c47-5ff9-4a95-b532-bc4dd296177b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.977570] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b44dc757-16eb-4585-9306-0369bee9b248 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.985833] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-574aa69e-1606-4b05-81bc-5f8eba1ece43 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.021381] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-323b841f-7710-40c9-8e30-aa6b1740fd92 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.031691] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1e2b417-03be-4afc-8613-8babd53d57bd {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.042376] env[59534]: DEBUG nova.compute.provider_tree [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.051678] env[59534]: DEBUG nova.scheduler.client.report [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.070630] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.071129] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Start building networks asynchronously for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 729.073437] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.309s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.112058] env[59534]: DEBUG nova.compute.utils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 729.113550] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 729.113550] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 729.130473] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 729.197020] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.207184] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-f4a120d9-3d50-4905-bcac-c6b632fa0295" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.207565] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 729.207753] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 729.208287] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8e9153a8-0d62-4c73-a8dc-c629c14051e5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.219107] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 729.228127] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f951dee2-ca57-48e6-8fc0-0a1101afaaeb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.250540] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 729.250919] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 729.251320] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 729.251919] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 729.251919] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 729.251919] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, 
cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 729.252370] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 729.252741] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 729.252969] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 729.253202] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 729.253503] env[59534]: DEBUG nova.virt.hardware [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 729.255118] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6c45cfd2-3c20-45ef-90a0-60fcf23e226d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.267714] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f4a120d9-3d50-4905-bcac-c6b632fa0295 could not be found. [ 729.267945] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 729.268218] env[59534]: INFO nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Took 0.06 seconds to destroy the instance on the hypervisor. [ 729.268504] env[59534]: DEBUG oslo.service.loopingcall [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 729.269299] env[59534]: DEBUG nova.compute.manager [-] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 729.269488] env[59534]: DEBUG nova.network.neutron [-] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 729.274920] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c700bd1-410e-499e-914d-d73eb92c4dbd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.306721] env[59534]: DEBUG nova.policy [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b953fe008184b81aeb499f4a9ae1937', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1461016ded45466183d6c24bf475f4fb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.310565] env[59534]: DEBUG nova.network.neutron [-] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.321947] env[59534]: DEBUG nova.network.neutron [-] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.335740] env[59534]: INFO nova.compute.manager [-] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Took 0.07 seconds to deallocate network for instance. [ 729.338529] env[59534]: DEBUG nova.compute.claims [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 729.338753] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.373214] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b6ea8de-5d6a-47e3-96f3-c7b3d8b8a28f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.381219] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef5cb683-8f9e-4b4f-acfc-3c3b6df08229 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.412665] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4487cceb-5cba-4f4b-9e48-75cea258feb6 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.420282] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bfc863a-1001-430e-90c3-1122a6c09892 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.434923] env[59534]: DEBUG nova.compute.provider_tree [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.438900] env[59534]: ERROR nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. 
[ 729.438900] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 729.438900] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.438900] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 729.438900] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.438900] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 729.438900] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.438900] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 729.438900] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.438900] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 729.438900] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.438900] env[59534]: ERROR nova.compute.manager raise self.value [ 729.438900] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.438900] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 729.438900] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.438900] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 729.439504] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.439504] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 729.439504] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. [ 729.439504] env[59534]: ERROR nova.compute.manager [ 729.439504] env[59534]: Traceback (most recent call last): [ 729.439504] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 729.439504] env[59534]: listener.cb(fileno) [ 729.439504] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.439504] env[59534]: result = function(*args, **kwargs) [ 729.439504] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.439504] env[59534]: return func(*args, **kwargs) [ 729.439504] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.439504] env[59534]: raise e [ 729.439504] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.439504] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 729.439504] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.439504] env[59534]: created_port_ids = self._update_ports_for_instance( [ 729.439504] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.439504] env[59534]: with excutils.save_and_reraise_exception(): [ 729.439504] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.439504] env[59534]: self.force_reraise() [ 729.439504] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.439504] env[59534]: raise self.value [ 729.439504] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.439504] env[59534]: updated_port = self._update_port( [ 729.439504] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.439504] env[59534]: _ensure_no_port_binding_failure(port) [ 729.439504] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.439504] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 729.440472] env[59534]: nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. [ 729.440472] env[59534]: Removing descriptor: 21 [ 729.440472] env[59534]: ERROR nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Traceback (most recent call last): [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] yield resources [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self.driver.spawn(context, instance, image_meta, [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.440472] env[59534]: ERROR nova.compute.manager 
[instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 729.440472] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] vm_ref = self.build_virtual_machine(instance, [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] vif_infos = vmwarevif.get_vif_info(self._session, [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] for vif in network_info: [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return self._sync_wrapper(fn, *args, **kwargs) [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self.wait() [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self[:] = self._gt.wait() [ 729.440884] 
env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return self._exit_event.wait() [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.440884] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] result = hub.switch() [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return self.greenlet.switch() [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] result = function(*args, **kwargs) [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return func(*args, **kwargs) [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] raise e [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] nwinfo = self.network_api.allocate_for_instance( [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] created_port_ids = self._update_ports_for_instance( [ 729.441438] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] with excutils.save_and_reraise_exception(): [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self.force_reraise() [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] raise self.value [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] updated_port = self._update_port( [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] _ensure_no_port_binding_failure(port) [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] raise exception.PortBindingFailed(port_id=port['id']) [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. [ 729.442061] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] [ 729.442521] env[59534]: INFO nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Terminating instance [ 729.442521] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "refresh_cache-f3febdda-db75-4048-a2b5-aa4daf2dcfa2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 729.442521] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquired lock "refresh_cache-f3febdda-db75-4048-a2b5-aa4daf2dcfa2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.442805] env[59534]: DEBUG nova.network.neutron [None 
req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 729.448177] env[59534]: DEBUG nova.scheduler.client.report [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.464477] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.391s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.465319] env[59534]: ERROR nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. 
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Traceback (most recent call last):
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self.driver.spawn(context, instance, image_meta,
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] vm_ref = self.build_virtual_machine(instance,
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] vif_infos = vmwarevif.get_vif_info(self._session,
[ 729.465319] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] for vif in network_info:
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return self._sync_wrapper(fn, *args, **kwargs)
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self.wait()
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self[:] = self._gt.wait()
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return self._exit_event.wait()
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] result = hub.switch()
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return self.greenlet.switch()
[ 729.466049] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] result = function(*args, **kwargs)
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] return func(*args, **kwargs)
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] raise e
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] nwinfo = self.network_api.allocate_for_instance(
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] created_port_ids = self._update_ports_for_instance(
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] with excutils.save_and_reraise_exception():
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 729.466550] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] self.force_reraise()
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] raise self.value
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] updated_port = self._update_port(
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] _ensure_no_port_binding_failure(port)
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] raise exception.PortBindingFailed(port_id=port['id'])
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83] nova.exception.PortBindingFailed: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information.
[ 729.467018] env[59534]: ERROR nova.compute.manager [instance: f1e315bf-9348-4631-af77-85ec3a986a83]
[ 729.467018] env[59534]: DEBUG nova.compute.utils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 729.467618] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.129s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 729.470889] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Build of instance f1e315bf-9348-4631-af77-85ec3a986a83 was re-scheduled: Binding failed for port 7a72d80e-8567-4d32-bb2b-d09fce05e08b, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 729.472658] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 729.472658] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-f1e315bf-9348-4631-af77-85ec3a986a83" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 729.472658] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-f1e315bf-9348-4631-af77-85ec3a986a83" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 729.472658] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 729.495035] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 729.525342] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 729.684839] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76695e84-0d4a-430f-9e83-20b62bf480be {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.692503] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2d2ea7-856c-42b2-adf2-b8fd5d1e183b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.729706] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2412909d-9c9d-4198-b67d-07b7f3536cc2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.737943] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ff7c057-145b-4021-97f3-3203a788b917 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 729.751293] env[59534]: DEBUG nova.compute.provider_tree [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 729.766903] env[59534]: DEBUG nova.scheduler.client.report [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 729.780272] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.313s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.780885] env[59534]: ERROR nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information.
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Traceback (most recent call last):
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self.driver.spawn(context, instance, image_meta,
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] vm_ref = self.build_virtual_machine(instance,
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] vif_infos = vmwarevif.get_vif_info(self._session,
[ 729.780885] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] for vif in network_info:
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return self._sync_wrapper(fn, *args, **kwargs)
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self.wait()
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self[:] = self._gt.wait()
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return self._exit_event.wait()
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] result = hub.switch()
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return self.greenlet.switch()
[ 729.781936] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] result = function(*args, **kwargs)
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] return func(*args, **kwargs)
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] raise e
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] nwinfo = self.network_api.allocate_for_instance(
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] created_port_ids = self._update_ports_for_instance(
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] with excutils.save_and_reraise_exception():
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 729.782767] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] self.force_reraise()
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] raise self.value
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] updated_port = self._update_port(
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] _ensure_no_port_binding_failure(port)
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] raise exception.PortBindingFailed(port_id=port['id'])
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] nova.exception.PortBindingFailed: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information.
[ 729.783424] env[59534]: ERROR nova.compute.manager [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295]
[ 729.783424] env[59534]: DEBUG nova.compute.utils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 729.783849] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Build of instance f4a120d9-3d50-4905-bcac-c6b632fa0295 was re-scheduled: Binding failed for port 9baae0bb-f909-4f91-84e5-58b3057040a8, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 729.784195] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 729.784195] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-f4a120d9-3d50-4905-bcac-c6b632fa0295" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 729.784278] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-f4a120d9-3d50-4905-bcac-c6b632fa0295" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 729.784402] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 729.863731] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Successfully created port: e94d9514-fda5-4b4e-b0ec-bab350fab30b {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 730.058686] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 730.186451] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.198582] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Releasing lock "refresh_cache-f3febdda-db75-4048-a2b5-aa4daf2dcfa2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 730.199468] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 730.199569] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 730.200067] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-64c6bb89-9c63-4009-8efb-c9786d92d144 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.212076] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.216517] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78ffa85e-e41f-4122-b3fe-afac6b87c35c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.229561] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-f1e315bf-9348-4631-af77-85ec3a986a83" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 730.230114] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 730.230366] env[59534]: DEBUG nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 730.230603] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 730.250719] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f3febdda-db75-4048-a2b5-aa4daf2dcfa2 could not be found.
[ 730.250719] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 730.250719] env[59534]: INFO nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 730.251021] env[59534]: DEBUG oslo.service.loopingcall [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 730.251184] env[59534]: DEBUG nova.compute.manager [-] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 730.251329] env[59534]: DEBUG nova.network.neutron [-] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 730.282532] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 730.293017] env[59534]: DEBUG nova.network.neutron [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.302027] env[59534]: DEBUG nova.network.neutron [-] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 730.310619] env[59534]: INFO nova.compute.manager [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: f1e315bf-9348-4631-af77-85ec3a986a83] Took 0.08 seconds to deallocate network for instance.
[ 730.314113] env[59534]: DEBUG nova.network.neutron [-] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.326980] env[59534]: INFO nova.compute.manager [-] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Took 0.08 seconds to deallocate network for instance.
[ 730.330484] env[59534]: DEBUG nova.compute.claims [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 730.330662] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 730.330868] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 730.452442] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 730.462426] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-f4a120d9-3d50-4905-bcac-c6b632fa0295" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 730.462659] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 730.462838] env[59534]: DEBUG nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 730.462995] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 730.476770] env[59534]: INFO nova.scheduler.client.report [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Deleted allocations for instance f1e315bf-9348-4631-af77-85ec3a986a83
[ 730.496234] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Instance cache missing network info.
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.505313] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquiring lock "192ab790-d9db-4990-93d0-24603bd65016" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.505544] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Lock "192ab790-d9db-4990-93d0-24603bd65016" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.509072] env[59534]: DEBUG oslo_concurrency.lockutils [None req-36c28f5a-ddf6-45c8-a08b-1dae74d610ca tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "f1e315bf-9348-4631-af77-85ec3a986a83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.980s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.515514] env[59534]: DEBUG nova.network.neutron [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.524257] env[59534]: DEBUG nova.compute.manager [None 
req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 730.528460] env[59534]: INFO nova.compute.manager [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: f4a120d9-3d50-4905-bcac-c6b632fa0295] Took 0.07 seconds to deallocate network for instance. [ 730.597277] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.639638] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dc02d9b-3234-45ea-b81e-a67da27e6df7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.648116] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d69929e-a838-4f28-8423-ee6947bd4bd6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.652166] env[59534]: ERROR nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information. 
[ 730.652166] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 730.652166] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 730.652166] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 730.652166] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 730.652166] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 730.652166] env[59534]: ERROR nova.compute.manager raise self.value
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 730.652166] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 730.652166] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 730.652166] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 730.652666] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 730.652666] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 730.652666] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information.
[ 730.652666] env[59534]: ERROR nova.compute.manager
[ 730.652666] env[59534]: Traceback (most recent call last):
[ 730.652666] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 730.652666] env[59534]: listener.cb(fileno)
[ 730.652666] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 730.652666] env[59534]: result = function(*args, **kwargs)
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 730.652666] env[59534]: return func(*args, **kwargs)
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 730.652666] env[59534]: raise e
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 730.652666] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 730.652666] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 730.652666] env[59534]: with excutils.save_and_reraise_exception():
[ 730.652666] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 730.652666] env[59534]: self.force_reraise()
[ 730.652666] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 730.652666] env[59534]: raise self.value
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 730.652666] env[59534]: updated_port = self._update_port(
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 730.652666] env[59534]: _ensure_no_port_binding_failure(port)
[ 730.652666] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 730.652666] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 730.653431] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information.
[ 730.653431] env[59534]: Removing descriptor: 15
[ 730.653431] env[59534]: ERROR nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information.
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Traceback (most recent call last):
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] yield resources
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self.driver.spawn(context, instance, image_meta,
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 730.653431] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] vm_ref = self.build_virtual_machine(instance,
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] vif_infos = vmwarevif.get_vif_info(self._session,
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] for vif in network_info:
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return self._sync_wrapper(fn, *args, **kwargs)
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self.wait()
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self[:] = self._gt.wait()
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return self._exit_event.wait()
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 730.653730] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] result = hub.switch()
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return self.greenlet.switch()
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] result = function(*args, **kwargs)
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return func(*args, **kwargs)
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] raise e
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] nwinfo = self.network_api.allocate_for_instance(
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] created_port_ids = self._update_ports_for_instance(
[ 730.654074] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] with excutils.save_and_reraise_exception():
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self.force_reraise()
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] raise self.value
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] updated_port = self._update_port(
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] _ensure_no_port_binding_failure(port)
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] raise exception.PortBindingFailed(port_id=port['id'])
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information.
[ 730.654379] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb]
[ 730.654701] env[59534]: INFO nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Terminating instance
[ 730.655692] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquiring lock "refresh_cache-e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 730.655692] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquired lock "refresh_cache-e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 730.655692] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 730.689295] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Successfully created port: 19c303ea-3949-49ef-b1da-9130f02425b2 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 730.691445] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a0c004d-7913-449c-b5b2-240114f9e339 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.703981] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34d05eb4-b4fd-425e-bbc4-da375c61ad6e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 730.714433] env[59534]: INFO nova.scheduler.client.report [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Deleted allocations for instance f4a120d9-3d50-4905-bcac-c6b632fa0295
[ 730.726328] env[59534]: DEBUG nova.compute.provider_tree [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 730.729115] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 730.735280] env[59534]: DEBUG nova.scheduler.client.report [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 730.750377] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.419s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 730.753743] env[59534]: ERROR nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information.
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Traceback (most recent call last):
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self.driver.spawn(context, instance, image_meta,
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] vm_ref = self.build_virtual_machine(instance,
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] vif_infos = vmwarevif.get_vif_info(self._session,
[ 730.753743] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] for vif in network_info:
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return self._sync_wrapper(fn, *args, **kwargs)
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self.wait()
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self[:] = self._gt.wait()
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return self._exit_event.wait()
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] result = hub.switch()
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return self.greenlet.switch()
[ 730.758176] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] result = function(*args, **kwargs)
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] return func(*args, **kwargs)
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] raise e
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] nwinfo = self.network_api.allocate_for_instance(
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] created_port_ids = self._update_ports_for_instance(
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] with excutils.save_and_reraise_exception():
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 730.758695] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] self.force_reraise()
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] raise self.value
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] updated_port = self._update_port(
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] _ensure_no_port_binding_failure(port)
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] raise exception.PortBindingFailed(port_id=port['id'])
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] nova.exception.PortBindingFailed: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information.
[ 730.759014] env[59534]: ERROR nova.compute.manager [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2]
[ 730.759014] env[59534]: DEBUG nova.compute.utils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 730.759297] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.159s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 730.759297] env[59534]: INFO nova.compute.claims [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 730.759637] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Build of instance f3febdda-db75-4048-a2b5-aa4daf2dcfa2 was re-scheduled: Binding failed for port f10be37c-9278-4524-a459-645ac91b41fa, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 730.760111] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 730.760325] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquiring lock "refresh_cache-f3febdda-db75-4048-a2b5-aa4daf2dcfa2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 730.760463] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Acquired lock "refresh_cache-f3febdda-db75-4048-a2b5-aa4daf2dcfa2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 730.760727] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 730.763538] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f8dd3caa-cdda-4335-b9df-9dec028340f4 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "f4a120d9-3d50-4905-bcac-c6b632fa0295" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.559s {{(pid=59534) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.778915] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 730.844162] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.867571] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.903556] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "744abb9c-379e-477b-a493-210cde36c314" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.903846] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "744abb9c-379e-477b-a493-210cde36c314" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 
0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.976083] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.986615] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Releasing lock "refresh_cache-e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 730.986949] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 730.987143] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 730.987670] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5aacef2c-ad45-4e26-9216-c3453d7106a1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.000036] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a63315-e9b3-4ff9-90b4-0c159e0bdc31 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.025584] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e11cc8c7-86ac-49a6-b4b5-2daedf0066bb could not be found. 
[ 731.025817] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 731.026012] env[59534]: INFO nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 731.026290] env[59534]: DEBUG oslo.service.loopingcall [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 731.027363] env[59534]: DEBUG nova.compute.manager [-] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 731.027468] env[59534]: DEBUG nova.network.neutron [-] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.029682] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7203893-c440-4cb8-8720-7b999263c446 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.038569] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10ab666f-6d2e-449a-bdfd-c7bbf2a408a1 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.070394] env[59534]: DEBUG nova.network.neutron [-] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.072094] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71876c91-9bc7-4f9f-b0df-a98862773587 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.082891] env[59534]: DEBUG nova.network.neutron [-] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.088130] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed3a5fee-266b-4513-b37d-1b2e601431ad {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.093632] env[59534]: INFO nova.compute.manager [-] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Took 0.07 seconds to deallocate network for instance. 
[ 731.104857] env[59534]: DEBUG nova.compute.provider_tree [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.104857] env[59534]: DEBUG nova.compute.claims [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 731.105087] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.112867] env[59534]: DEBUG nova.scheduler.client.report [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.126090] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 
tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.126562] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 731.128820] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.261s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.130181] env[59534]: INFO nova.compute.claims [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 731.163152] env[59534]: DEBUG nova.compute.utils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 731.164446] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] 
Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 731.164620] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 731.173846] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 731.278688] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 731.299478] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 731.299715] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 731.299866] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 731.300055] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Flavor 
pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 731.300209] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 731.300343] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 731.300541] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 731.300694] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 731.300851] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 731.301835] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 
tempest-ServerDiskConfigTestJSON-1220714776-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 731.301835] env[59534]: DEBUG nova.virt.hardware [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 731.302112] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f42f107-daab-412f-986d-46de885d4790 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.313565] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-589c15b4-d30c-4e6b-a546-d09b3078b6af {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.318322] env[59534]: DEBUG nova.policy [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f0a9d41caa34cf8a03d948175f148f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b53e0ad3e7a2485ba3547c57d84748ef', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 731.335616] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 
tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.345776] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Releasing lock "refresh_cache-f3febdda-db75-4048-a2b5-aa4daf2dcfa2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.345965] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 731.346152] env[59534]: DEBUG nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 731.346328] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.357909] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b58d5a-e8e7-4191-9dc0-e1e9b00d2499 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.365775] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59672f05-4777-4e5d-bca4-54c06691b5fb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.395666] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf16e4e-a461-4239-ab01-d0acfea0ef67 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.403239] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5493fa09-a9e0-4163-9d33-e8357f1445b2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.416372] env[59534]: DEBUG nova.compute.provider_tree [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.425525] env[59534]: DEBUG nova.scheduler.client.report [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
731.442588] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.443106] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 731.445461] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.341s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.483021] env[59534]: DEBUG nova.compute.utils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 731.484224] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 731.484401] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 731.494170] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 731.564425] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 731.587711] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 731.587888] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 731.588093] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 731.588289] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Flavor pref 0:0:0 
{{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 731.588433] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 731.588574] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 731.588773] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 731.588924] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 731.589093] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 731.589250] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 731.589451] env[59534]: DEBUG nova.virt.hardware [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 731.590311] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fae94d29-2105-4562-9a25-4e40c226044b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 731.600754] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b044962-9047-45ef-a2d4-5ade59d25b5f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 731.622545] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 731.633043] env[59534]: DEBUG nova.network.neutron [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 731.645530] env[59534]: INFO nova.compute.manager [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] [instance: f3febdda-db75-4048-a2b5-aa4daf2dcfa2] Took 0.30 seconds to deallocate network for instance.
[ 731.684665] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-133733a5-5707-4378-a2cd-4befc574a13f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 731.691959] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c89aa6b-f9f5-4a11-9251-3b0200a53926 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 731.721795] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f8578b-601b-49eb-ae56-bd0575e9ed95 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 731.732251] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbf21dd8-90bc-458c-91ec-03b2ca812a6d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 731.747733] env[59534]: DEBUG nova.compute.provider_tree [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 731.749608] env[59534]: INFO nova.scheduler.client.report [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Deleted allocations for instance f3febdda-db75-4048-a2b5-aa4daf2dcfa2
[ 731.755667] env[59534]: DEBUG nova.scheduler.client.report [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 731.770057] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.324s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 731.770552] env[59534]: ERROR nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information.
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Traceback (most recent call last):
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self.driver.spawn(context, instance, image_meta,
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] vm_ref = self.build_virtual_machine(instance,
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] vif_infos = vmwarevif.get_vif_info(self._session,
[ 731.770552] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] for vif in network_info:
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return self._sync_wrapper(fn, *args, **kwargs)
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self.wait()
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self[:] = self._gt.wait()
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return self._exit_event.wait()
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] result = hub.switch()
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return self.greenlet.switch()
[ 731.770866] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] result = function(*args, **kwargs)
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] return func(*args, **kwargs)
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] raise e
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] nwinfo = self.network_api.allocate_for_instance(
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] created_port_ids = self._update_ports_for_instance(
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] with excutils.save_and_reraise_exception():
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 731.771239] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] self.force_reraise()
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] raise self.value
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] updated_port = self._update_port(
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] _ensure_no_port_binding_failure(port)
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] raise exception.PortBindingFailed(port_id=port['id'])
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] nova.exception.PortBindingFailed: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information.
[ 731.771565] env[59534]: ERROR nova.compute.manager [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb]
[ 731.771565] env[59534]: DEBUG nova.compute.utils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 731.774690] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Build of instance e11cc8c7-86ac-49a6-b4b5-2daedf0066bb was re-scheduled: Binding failed for port 8309cb18-2e4f-4e04-b760-280bae870627, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 731.775115] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 731.775392] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquiring lock "refresh_cache-e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 731.776159] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Acquired lock "refresh_cache-e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 731.776159] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 731.790570] env[59534]: DEBUG oslo_concurrency.lockutils [None req-69a96172-a4d6-4946-b602-31d353a5cddf tempest-ListServerFiltersTestJSON-712311738 tempest-ListServerFiltersTestJSON-712311738-project-member] Lock "f3febdda-db75-4048-a2b5-aa4daf2dcfa2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.377s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 731.801141] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 731.851384] env[59534]: ERROR nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 731.851384] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 731.851384] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 731.851384] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 731.851384] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 731.851384] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 731.851384] env[59534]: ERROR nova.compute.manager raise self.value
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 731.851384] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 731.851384] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 731.851384] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 731.851861] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 731.851861] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 731.851861] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 731.851861] env[59534]: ERROR nova.compute.manager
[ 731.851861] env[59534]: Traceback (most recent call last):
[ 731.851861] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 731.851861] env[59534]: listener.cb(fileno)
[ 731.851861] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 731.851861] env[59534]: result = function(*args, **kwargs)
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 731.851861] env[59534]: return func(*args, **kwargs)
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 731.851861] env[59534]: raise e
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 731.851861] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 731.851861] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 731.851861] env[59534]: with excutils.save_and_reraise_exception():
[ 731.851861] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 731.851861] env[59534]: self.force_reraise()
[ 731.851861] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 731.851861] env[59534]: raise self.value
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 731.851861] env[59534]: updated_port = self._update_port(
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 731.851861] env[59534]: _ensure_no_port_binding_failure(port)
[ 731.851861] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 731.851861] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 731.852563] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 731.852563] env[59534]: Removing descriptor: 16
[ 731.852563] env[59534]: ERROR nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Traceback (most recent call last):
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] yield resources
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self.driver.spawn(context, instance, image_meta,
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 731.852563] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] vm_ref = self.build_virtual_machine(instance,
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] vif_infos = vmwarevif.get_vif_info(self._session,
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] for vif in network_info:
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return self._sync_wrapper(fn, *args, **kwargs)
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self.wait()
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self[:] = self._gt.wait()
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return self._exit_event.wait()
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 731.852880] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] result = hub.switch()
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return self.greenlet.switch()
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] result = function(*args, **kwargs)
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return func(*args, **kwargs)
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] raise e
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] nwinfo = self.network_api.allocate_for_instance(
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] created_port_ids = self._update_ports_for_instance(
[ 731.853327] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] with excutils.save_and_reraise_exception():
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self.force_reraise()
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] raise self.value
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] updated_port = self._update_port(
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] _ensure_no_port_binding_failure(port)
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] raise exception.PortBindingFailed(port_id=port['id'])
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 731.853669] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91]
[ 731.854012] env[59534]: INFO nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Terminating instance
[ 731.855930] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 731.857033] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 731.858101] env[59534]: INFO nova.compute.claims [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 731.860495] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Acquiring lock "refresh_cache-3a20b8f8-f10c-4303-b801-e0831da74f91" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 731.860653] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Acquired lock "refresh_cache-3a20b8f8-f10c-4303-b801-e0831da74f91" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 731.860813] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 731.876987] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 731.881119] env[59534]: DEBUG nova.policy [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd42a951140ab44a391b89f74aa9ac2fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '06f260ca811e4dada55813b2b8656839', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 731.958559] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 732.071347] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75179c07-9044-4876-a2b5-04d726139855 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 732.080639] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac299fe-97fd-41b0-ade5-f6641037d2fd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 732.112188] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e3c0c2d-567a-46bb-9a2c-426fb5c484b9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 732.120737] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a924e06f-17bb-4245-a77a-6c5ed381da9b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 732.137345] env[59534]: DEBUG nova.compute.provider_tree [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 732.148852] env[59534]: DEBUG nova.scheduler.client.report [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB':
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.165376] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.168837] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 732.207465] env[59534]: DEBUG nova.compute.utils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 732.208797] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 732.208958] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 732.218736] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 732.293880] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 732.321454] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 732.321690] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 732.321843] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 732.322093] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 
tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 732.322266] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 732.322416] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 732.322620] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 732.323037] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 732.323037] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:501}} [ 732.323141] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 732.323251] env[59534]: DEBUG nova.virt.hardware [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 732.324111] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd386e6a-7947-455c-adbe-8c9a26384a95 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.334023] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67d744c4-c03c-41dc-a201-a89f50bd1166 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.394617] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Successfully created port: 5449ec5f-79d3-4359-b75c-1286d8c3a380 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.423827] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Updating 
instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 732.435191] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Releasing lock "refresh_cache-e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 732.435428] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 732.435610] env[59534]: DEBUG nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 732.435772] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 732.455311] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 732.466780] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Releasing lock "refresh_cache-3a20b8f8-f10c-4303-b801-e0831da74f91" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 732.466780] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 732.466780] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 732.467272] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-def8e09f-0648-4dd4-b740-787df131d275 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.476697] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b462a63-40f8-495a-aa8c-83798a59fa7a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.500494] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3a20b8f8-f10c-4303-b801-e0831da74f91 could not be found. 
[ 732.500725] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 732.500905] env[59534]: INFO nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Took 0.03 seconds to destroy the instance on the hypervisor. [ 732.501251] env[59534]: DEBUG oslo.service.loopingcall [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 732.501527] env[59534]: DEBUG nova.compute.manager [-] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 732.501599] env[59534]: DEBUG nova.network.neutron [-] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 732.691073] env[59534]: ERROR nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. 
[ 732.691073] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 732.691073] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 732.691073] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 732.691073] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 732.691073] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 732.691073] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 732.691073] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 732.691073] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.691073] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 732.691073] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.691073] env[59534]: ERROR nova.compute.manager raise self.value [ 732.691073] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 732.691073] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 732.691073] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.691073] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 732.691522] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.691522] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 732.691522] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. [ 732.691522] env[59534]: ERROR nova.compute.manager [ 732.691522] env[59534]: Traceback (most recent call last): [ 732.691522] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 732.691522] env[59534]: listener.cb(fileno) [ 732.691522] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.691522] env[59534]: result = function(*args, **kwargs) [ 732.691522] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.691522] env[59534]: return func(*args, **kwargs) [ 732.691522] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 732.691522] env[59534]: raise e [ 732.691522] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 732.691522] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 732.691522] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 732.691522] env[59534]: created_port_ids = self._update_ports_for_instance( [ 732.691522] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 732.691522] env[59534]: with excutils.save_and_reraise_exception(): [ 732.691522] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.691522] env[59534]: self.force_reraise() [ 732.691522] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.691522] env[59534]: raise self.value [ 732.691522] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 732.691522] env[59534]: updated_port = self._update_port( [ 732.691522] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.691522] env[59534]: _ensure_no_port_binding_failure(port) [ 732.691522] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.691522] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 732.692229] env[59534]: nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. [ 732.692229] env[59534]: Removing descriptor: 19 [ 732.692229] env[59534]: ERROR nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Traceback (most recent call last): [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] yield resources [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self.driver.spawn(context, instance, image_meta, [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: 
c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self._vmops.spawn(context, instance, image_meta, injected_files, [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 732.692229] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] vm_ref = self.build_virtual_machine(instance, [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] vif_infos = vmwarevif.get_vif_info(self._session, [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] for vif in network_info: [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return self._sync_wrapper(fn, *args, **kwargs) [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self.wait() [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self[:] = self._gt.wait() [ 732.692573] 
env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return self._exit_event.wait() [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 732.692573] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] result = hub.switch() [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return self.greenlet.switch() [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] result = function(*args, **kwargs) [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return func(*args, **kwargs) [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] raise e [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] nwinfo = self.network_api.allocate_for_instance( [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] created_port_ids = self._update_ports_for_instance( [ 732.692905] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] with excutils.save_and_reraise_exception(): [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self.force_reraise() [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] raise self.value [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] updated_port = self._update_port( [ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] _ensure_no_port_binding_failure(port)
[ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] raise exception.PortBindingFailed(port_id=port['id'])
[ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information.
[ 732.693250] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402]
[ 732.693558] env[59534]: INFO nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Terminating instance
[ 732.695533] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquiring lock "refresh_cache-c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 732.695695] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquired lock "refresh_cache-c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 732.696275] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 732.701099] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 732.719988] env[59534]: DEBUG nova.network.neutron [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 732.731228] env[59534]: INFO nova.compute.manager [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] [instance: e11cc8c7-86ac-49a6-b4b5-2daedf0066bb] Took 0.30 seconds to deallocate network for instance.
[ 732.790662] env[59534]: DEBUG nova.network.neutron [-] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 732.806019] env[59534]: DEBUG nova.network.neutron [-] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 732.807378] env[59534]: DEBUG nova.policy [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0082b0572d2848f794c057a93291ae86', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4632fed38799437594e5019a85a7bf81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}}
[ 732.812802] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 732.818947] env[59534]: INFO nova.compute.manager [-] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Took 0.32 seconds to deallocate network for instance.
[ 732.824731] env[59534]: DEBUG nova.compute.claims [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 732.824731] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 732.824909] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 732.855202] env[59534]: INFO nova.scheduler.client.report [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Deleted allocations for instance e11cc8c7-86ac-49a6-b4b5-2daedf0066bb
[ 732.886442] env[59534]: DEBUG oslo_concurrency.lockutils [None req-494505ca-89ce-4b08-9a78-9771aed9b9e1 tempest-ServersTestFqdnHostnames-1395823399 tempest-ServersTestFqdnHostnames-1395823399-project-member] Lock "e11cc8c7-86ac-49a6-b4b5-2daedf0066bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.810s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 732.905320] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 732.961536] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 733.026554] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abacc580-957a-4fd5-920c-82566c64f4a8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.035909] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a28bcf7d-6b44-4029-a53d-5a194fc9fe6e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.066722] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b6868f3-5776-446a-bb74-f15ddd4ed537 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.074263] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2aaaed3-9b1b-4ec3-b849-34904adfd9d9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.088022] env[59534]: DEBUG nova.compute.provider_tree [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 733.099309] env[59534]: DEBUG nova.scheduler.client.report [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 733.118018] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.292s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 733.118018] env[59534]: ERROR nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Traceback (most recent call last):
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self.driver.spawn(context, instance, image_meta,
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 733.118018] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] vm_ref = self.build_virtual_machine(instance,
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] vif_infos = vmwarevif.get_vif_info(self._session,
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] for vif in network_info:
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return self._sync_wrapper(fn, *args, **kwargs)
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self.wait()
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self[:] = self._gt.wait()
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return self._exit_event.wait()
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 733.118402] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] result = hub.switch()
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return self.greenlet.switch()
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] result = function(*args, **kwargs)
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] return func(*args, **kwargs)
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] raise e
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] nwinfo = self.network_api.allocate_for_instance(
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] created_port_ids = self._update_ports_for_instance(
[ 733.118797] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] with excutils.save_and_reraise_exception():
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] self.force_reraise()
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] raise self.value
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] updated_port = self._update_port(
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] _ensure_no_port_binding_failure(port)
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] raise exception.PortBindingFailed(port_id=port['id'])
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] nova.exception.PortBindingFailed: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information.
[ 733.119216] env[59534]: ERROR nova.compute.manager [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91]
[ 733.119584] env[59534]: DEBUG nova.compute.utils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 733.120766] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.159s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 733.121634] env[59534]: INFO nova.compute.claims [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 733.124566] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Build of instance 3a20b8f8-f10c-4303-b801-e0831da74f91 was re-scheduled: Binding failed for port 7ee485bb-e516-4317-bb2a-05b5ab3d6574, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 733.124845] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 733.125088] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Acquiring lock "refresh_cache-3a20b8f8-f10c-4303-b801-e0831da74f91" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 733.125261] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Acquired lock "refresh_cache-3a20b8f8-f10c-4303-b801-e0831da74f91" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 733.125428] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 733.206078] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 733.334983] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ac65bf-2196-4049-a614-4a2129039a25 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.342829] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8d3f908-3899-4c0d-8f0a-15a54bc5f065 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.375218] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda62dab-37a0-462f-b423-5c7f1e2fa29c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.380264] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0dc03a9-cb3c-45fe-a247-d437cb572597 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.393569] env[59534]: DEBUG nova.compute.provider_tree [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 733.403602] env[59534]: DEBUG nova.scheduler.client.report [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 733.422022] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 733.422022] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 733.425319] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 733.446493] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Releasing lock "refresh_cache-c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 733.446493] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 733.446493] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 733.446493] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1cf2fc99-9a45-440e-ba3c-3219da1f16f6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.455833] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dcb8141-063c-4b5d-b797-95bee4577d88 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.474765] env[59534]: DEBUG nova.compute.utils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 733.485637] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 733.485808] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 733.487897] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402 could not be found.
[ 733.488106] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 733.488438] env[59534]: INFO nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 733.488612] env[59534]: DEBUG oslo.service.loopingcall [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 733.489178] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 733.491894] env[59534]: DEBUG nova.compute.manager [-] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 733.491986] env[59534]: DEBUG nova.network.neutron [-] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 733.528585] env[59534]: DEBUG nova.network.neutron [-] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 733.536653] env[59534]: DEBUG nova.network.neutron [-] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 733.555636] env[59534]: INFO nova.compute.manager [-] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Took 0.06 seconds to deallocate network for instance.
[ 733.557757] env[59534]: DEBUG nova.compute.claims [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 733.557921] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 733.558198] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 733.569138] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 733.596741] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 733.596990] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 733.597158] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 733.597349] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 733.597492] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 733.597649] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 733.597832] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 733.597983] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 733.598666] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 733.598934] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 733.599179] env[59534]: DEBUG nova.virt.hardware [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 733.600203] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f32ced6-cc10-4cfc-a5d3-7181b96296d4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.612553] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71cdd751-9114-4d08-8166-1ce542824f85 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.780517] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8368eaf7-79e9-41ce-8efd-5689a6264a0c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.789332] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e8c92f2-098f-478a-8a41-a0b593c4a572 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.820498] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e63ff6c1-3fee-47b1-8b64-4c2afd5955f3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.828870] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99df2f3e-7b9d-4f90-b090-fe31cc331d6e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 733.844153] env[59534]: DEBUG nova.compute.provider_tree [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 733.853173] env[59534]: DEBUG nova.scheduler.client.report [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 733.866362] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 733.866955] env[59534]: ERROR nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information.
[ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Traceback (most recent call last): [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self.driver.spawn(context, instance, image_meta, [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self._vmops.spawn(context, instance, image_meta, injected_files, [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] vm_ref = self.build_virtual_machine(instance, [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] vif_infos = vmwarevif.get_vif_info(self._session, [ 733.866955] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] for vif in network_info: [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 733.867273] env[59534]: ERROR 
nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return self._sync_wrapper(fn, *args, **kwargs) [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self.wait() [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self[:] = self._gt.wait() [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return self._exit_event.wait() [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] result = hub.switch() [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return self.greenlet.switch() [ 733.867273] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] result = function(*args, **kwargs) [ 
733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] return func(*args, **kwargs) [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] raise e [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] nwinfo = self.network_api.allocate_for_instance( [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] created_port_ids = self._update_ports_for_instance( [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] with excutils.save_and_reraise_exception(): [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 733.867612] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] self.force_reraise() [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: 
c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] raise self.value [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] updated_port = self._update_port( [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] _ensure_no_port_binding_failure(port) [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] raise exception.PortBindingFailed(port_id=port['id']) [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] nova.exception.PortBindingFailed: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. [ 733.867971] env[59534]: ERROR nova.compute.manager [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] [ 733.867971] env[59534]: DEBUG nova.compute.utils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 733.869044] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Build of instance c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402 was re-scheduled: Binding failed for port e94d9514-fda5-4b4e-b0ec-bab350fab30b, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 733.869462] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 733.869678] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquiring lock "refresh_cache-c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.869823] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Acquired lock "refresh_cache-c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.869975] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Building network info cache for instance {{(pid=59534) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.915758] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 733.964982] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 733.975027] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Releasing lock "refresh_cache-3a20b8f8-f10c-4303-b801-e0831da74f91" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 733.975302] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 733.975490] env[59534]: DEBUG nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 733.975657] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 733.979583] env[59534]: DEBUG nova.policy [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5c59c008b2e44389ed9e37eb461a0c5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c0b17586dcd47b9b08c331a00083433', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 734.052456] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.059904] env[59534]: DEBUG nova.network.neutron [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.067289] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Successfully created port: 364b7b07-d7bf-4503-b03c-8a3c16403c51 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 734.076278] env[59534]: INFO nova.compute.manager [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] [instance: 3a20b8f8-f10c-4303-b801-e0831da74f91] Took 0.10 seconds to deallocate network for instance. 
[ 734.171870] env[59534]: INFO nova.scheduler.client.report [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Deleted allocations for instance 3a20b8f8-f10c-4303-b801-e0831da74f91 [ 734.193705] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f2058984-c25d-4e88-b377-ff4a61fe0be1 tempest-ServersTestJSON-865559043 tempest-ServersTestJSON-865559043-project-member] Lock "3a20b8f8-f10c-4303-b801-e0831da74f91" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.074s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.262745] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.272670] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Releasing lock "refresh_cache-c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.272924] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 734.273092] env[59534]: DEBUG nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 734.273286] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 734.326573] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.331724] env[59534]: DEBUG nova.network.neutron [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.342573] env[59534]: INFO nova.compute.manager [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] [instance: c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402] Took 0.07 seconds to deallocate network for instance. 
[ 734.431649] env[59534]: INFO nova.scheduler.client.report [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Deleted allocations for instance c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402 [ 734.454092] env[59534]: DEBUG oslo_concurrency.lockutils [None req-2b8d91f4-0074-4dc5-bcfb-534197c6a44b tempest-ImagesNegativeTestJSON-55314154 tempest-ImagesNegativeTestJSON-55314154-project-member] Lock "c3ddcc9b-7856-4ce0-8a5c-6f5f145b2402" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.092s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.663191] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Successfully created port: 289d3b0f-d24b-44cd-8e43-fb69290023da {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 735.800372] env[59534]: ERROR nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. 
[ 735.800372] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 735.800372] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.800372] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 735.800372] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.800372] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 735.800372] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.800372] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 735.800372] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.800372] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 735.800372] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.800372] env[59534]: ERROR nova.compute.manager raise self.value [ 735.800372] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.800372] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 735.800372] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.800372] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 735.802205] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.802205] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 735.802205] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. [ 735.802205] env[59534]: ERROR nova.compute.manager [ 735.802205] env[59534]: Traceback (most recent call last): [ 735.802205] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 735.802205] env[59534]: listener.cb(fileno) [ 735.802205] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.802205] env[59534]: result = function(*args, **kwargs) [ 735.802205] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 735.802205] env[59534]: return func(*args, **kwargs) [ 735.802205] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 735.802205] env[59534]: raise e [ 735.802205] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.802205] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 735.802205] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.802205] env[59534]: created_port_ids = self._update_ports_for_instance( [ 735.802205] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.802205] env[59534]: with excutils.save_and_reraise_exception(): [ 735.802205] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.802205] env[59534]: self.force_reraise() [ 735.802205] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.802205] env[59534]: raise self.value [ 735.802205] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.802205] env[59534]: updated_port = self._update_port( [ 735.802205] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.802205] env[59534]: _ensure_no_port_binding_failure(port) [ 735.802205] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.802205] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 735.804125] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. [ 735.804125] env[59534]: Removing descriptor: 15 [ 735.804125] env[59534]: ERROR nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Traceback (most recent call last): [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] yield resources [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self.driver.spawn(context, instance, image_meta, [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 735.804125] env[59534]: ERROR nova.compute.manager 
[instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 735.804125] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] vm_ref = self.build_virtual_machine(instance, [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] vif_infos = vmwarevif.get_vif_info(self._session, [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] for vif in network_info: [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return self._sync_wrapper(fn, *args, **kwargs) [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self.wait() [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self[:] = self._gt.wait() [ 735.805147] 
env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return self._exit_event.wait() [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 735.805147] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] result = hub.switch() [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return self.greenlet.switch() [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] result = function(*args, **kwargs) [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return func(*args, **kwargs) [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] raise e [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] nwinfo = self.network_api.allocate_for_instance( [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] created_port_ids = self._update_ports_for_instance( [ 735.806645] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] with excutils.save_and_reraise_exception(): [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self.force_reraise() [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] raise self.value [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] updated_port = self._update_port( [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] _ensure_no_port_binding_failure(port) [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] raise exception.PortBindingFailed(port_id=port['id']) [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. [ 735.807046] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] [ 735.807386] env[59534]: INFO nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Terminating instance [ 735.807386] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "refresh_cache-55002608-7ede-4f13-a820-09f8b7da3edf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.807386] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquired lock "refresh_cache-55002608-7ede-4f13-a820-09f8b7da3edf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.807386] env[59534]: DEBUG nova.network.neutron [None 
req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 735.835755] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.841727] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Successfully created port: da9d1c71-80d0-4a68-88b1-7bb68e089ee6 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 736.137225] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.149700] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Releasing lock "refresh_cache-55002608-7ede-4f13-a820-09f8b7da3edf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.150125] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 
tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 736.150367] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 736.150954] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0e4bc7b8-83ec-42f9-9345-b4fd5c73e693 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.162846] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd547f61-c12a-454d-9bea-93c4a6fdb38c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.190977] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 55002608-7ede-4f13-a820-09f8b7da3edf could not be found. 
[ 736.191283] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 736.191492] env[59534]: INFO nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Took 0.04 seconds to destroy the instance on the hypervisor. [ 736.191761] env[59534]: DEBUG oslo.service.loopingcall [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 736.192044] env[59534]: DEBUG nova.compute.manager [-] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 736.192176] env[59534]: DEBUG nova.network.neutron [-] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.214930] env[59534]: DEBUG nova.network.neutron [-] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.222496] env[59534]: DEBUG nova.network.neutron [-] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.231743] env[59534]: INFO nova.compute.manager [-] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Took 0.04 seconds to deallocate network for instance. [ 736.234007] env[59534]: DEBUG nova.compute.claims [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 736.234194] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.234406] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.380069] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac13f2f5-62f6-4098-84f4-7e055727ddcd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.389280] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29979408-b2c9-41d3-9dd8-ff8fae60212d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.424356] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79623667-f8bf-4f6c-8564-a12effc7bba5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.432493] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63950c06-19f7-425e-9bf1-e1953701d8c0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.446161] env[59534]: DEBUG nova.compute.provider_tree [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 736.455173] env[59534]: DEBUG nova.scheduler.client.report [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 736.473243] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 
tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.237s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.473243] env[59534]: ERROR nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Traceback (most recent call last): [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self.driver.spawn(context, instance, image_meta, [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 736.473243] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] vm_ref = self.build_virtual_machine(instance, [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] vif_infos = vmwarevif.get_vif_info(self._session, [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] for vif in network_info: [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return self._sync_wrapper(fn, *args, **kwargs) [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self.wait() [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self[:] = self._gt.wait() [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return self._exit_event.wait() [ 736.473615] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 736.473615] env[59534]: ERROR 
nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] result = hub.switch() [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return self.greenlet.switch() [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] result = function(*args, **kwargs) [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] return func(*args, **kwargs) [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] raise e [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] nwinfo = self.network_api.allocate_for_instance( [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] created_port_ids = 
self._update_ports_for_instance( [ 736.474043] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] with excutils.save_and_reraise_exception(): [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] self.force_reraise() [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] raise self.value [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] updated_port = self._update_port( [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] _ensure_no_port_binding_failure(port) [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] raise 
exception.PortBindingFailed(port_id=port['id']) [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] nova.exception.PortBindingFailed: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. [ 736.474409] env[59534]: ERROR nova.compute.manager [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] [ 736.474769] env[59534]: DEBUG nova.compute.utils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 736.477018] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Build of instance 55002608-7ede-4f13-a820-09f8b7da3edf was re-scheduled: Binding failed for port 5449ec5f-79d3-4359-b75c-1286d8c3a380, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 736.477018] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 736.477018] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "refresh_cache-55002608-7ede-4f13-a820-09f8b7da3edf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 736.477018] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquired lock "refresh_cache-55002608-7ede-4f13-a820-09f8b7da3edf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 736.477241] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 736.505813] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.744763] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.754588] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Releasing lock "refresh_cache-55002608-7ede-4f13-a820-09f8b7da3edf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.754588] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 736.754588] env[59534]: DEBUG nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 736.754693] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.782134] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.791897] env[59534]: DEBUG nova.network.neutron [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.800524] env[59534]: INFO nova.compute.manager [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 55002608-7ede-4f13-a820-09f8b7da3edf] Took 0.05 seconds to deallocate network for instance. 
[ 736.896118] env[59534]: INFO nova.scheduler.client.report [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Deleted allocations for instance 55002608-7ede-4f13-a820-09f8b7da3edf [ 736.920778] env[59534]: DEBUG oslo_concurrency.lockutils [None req-006a62de-e4e6-4a9c-b507-e4e760611339 tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "55002608-7ede-4f13-a820-09f8b7da3edf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.987s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.509630] env[59534]: WARNING oslo_vmware.rw_handles [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles response.begin() [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 737.509630] env[59534]: ERROR 
oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 737.509630] env[59534]: ERROR oslo_vmware.rw_handles
[ 737.510034] env[59534]: DEBUG nova.virt.vmwareapi.images [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Downloaded image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk on the data store datastore1 {{(pid=59534) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 737.516578] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Caching image {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 737.516894] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Copying Virtual Disk [datastore1] vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk to [datastore1] vmware_temp/31d0921e-7f23-4b18-a5c3-60a9191494cd/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk {{(pid=59534) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 737.517347] env[59534]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9fe29834-e746-433c-9b1e-a5832744db9e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 737.526191] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Waiting for the task: (returnval){
[ 737.526191] env[59534]: value = "task-1308573"
[ 737.526191] env[59534]: _type = "Task"
[ 737.526191] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 737.537683] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Task: {'id': task-1308573, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 737.934196] env[59534]: ERROR nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 737.934196] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.934196] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.934196] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.934196] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.934196] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.934196] env[59534]: ERROR nova.compute.manager raise self.value
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.934196] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 737.934196] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.934196] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 737.934845] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.934845] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 737.934845] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 737.934845] env[59534]: ERROR nova.compute.manager
[ 737.934845] env[59534]: Traceback (most recent call last):
[ 737.934845] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 737.934845] env[59534]: listener.cb(fileno)
[ 737.934845] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 737.934845] env[59534]: result = function(*args, **kwargs)
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 737.934845] env[59534]: return func(*args, **kwargs)
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 737.934845] env[59534]: raise e
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.934845] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.934845] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.934845] env[59534]: with excutils.save_and_reraise_exception():
[ 737.934845] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.934845] env[59534]: self.force_reraise()
[ 737.934845] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.934845] env[59534]: raise self.value
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.934845] env[59534]: updated_port = self._update_port(
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.934845] env[59534]: _ensure_no_port_binding_failure(port)
[ 737.934845] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.934845] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 737.935806] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 737.935806] env[59534]: Removing descriptor: 17
[ 737.935806] env[59534]: ERROR nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Traceback (most recent call last):
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] yield resources
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self.driver.spawn(context, instance, image_meta,
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 737.935806] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] vm_ref = self.build_virtual_machine(instance,
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] vif_infos = vmwarevif.get_vif_info(self._session,
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] for vif in network_info:
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return self._sync_wrapper(fn, *args, **kwargs)
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self.wait()
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self[:] = self._gt.wait()
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return self._exit_event.wait()
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 737.936161] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] result = hub.switch()
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return self.greenlet.switch()
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] result = function(*args, **kwargs)
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return func(*args, **kwargs)
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] raise e
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] nwinfo = self.network_api.allocate_for_instance(
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] created_port_ids = self._update_ports_for_instance(
[ 737.936535] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] with excutils.save_and_reraise_exception():
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self.force_reraise()
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] raise self.value
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] updated_port = self._update_port(
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] _ensure_no_port_binding_failure(port)
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] raise exception.PortBindingFailed(port_id=port['id'])
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 737.936877] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273]
[ 737.937419] env[59534]: INFO nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Terminating instance
[ 737.939784] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquiring lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 737.939784] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquired lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 737.939784] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 738.022257] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 738.040888] env[59534]: DEBUG oslo_vmware.exceptions [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Fault InvalidArgument not matched. {{(pid=59534) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 738.041253] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Releasing lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 738.042549] env[59534]: ERROR nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 738.042549] env[59534]: Faults: ['InvalidArgument']
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Traceback (most recent call last):
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] yield resources
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self.driver.spawn(context, instance, image_meta,
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self._fetch_image_if_missing(context, vi)
[ 738.042549] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] image_cache(vi, tmp_image_ds_loc)
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] vm_util.copy_virtual_disk(
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] session._wait_for_task(vmdk_copy_task)
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] return self.wait_for_task(task_ref)
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] return evt.wait()
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] result = hub.switch()
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 738.042917] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] return self.greenlet.switch()
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self.f(*self.args, **self.kw)
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] raise exceptions.translate_fault(task_info.error)
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Faults: ['InvalidArgument']
[ 738.043256] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd]
[ 738.043256] env[59534]: INFO nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Terminating instance
[ 738.046270] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 738.046270] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 738.046270] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "refresh_cache-053d549e-b3d6-4498-9261-cfacaf8b43bd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 738.046270] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquired lock "refresh_cache-053d549e-b3d6-4498-9261-cfacaf8b43bd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 738.046455] env[59534]: DEBUG nova.network.neutron [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 738.046709] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac3e9bc0-8c36-4084-9e84-4c0878c9289d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.057458] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 738.057661] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59534) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 738.058707] env[59534]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d8d5f0eb-9f8b-4124-95a5-ef645b063bc3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.064873] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){
[ 738.064873] env[59534]: value = "session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]528b257a-11ac-9853-8828-0a8e6d039031"
[ 738.064873] env[59534]: _type = "Task"
[ 738.064873] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 738.073421] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]528b257a-11ac-9853-8828-0a8e6d039031, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 738.087282] env[59534]: DEBUG nova.network.neutron [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 738.220571] env[59534]: DEBUG nova.network.neutron [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 738.241393] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Releasing lock "refresh_cache-053d549e-b3d6-4498-9261-cfacaf8b43bd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 738.241784] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 738.241969] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 738.243104] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19d3ce39-4027-4177-b1cc-0bb9b4c1c266 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.251678] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Unregistering the VM {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 738.252388] env[59534]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-293de82e-d8fb-4778-9aaa-512c383fe934 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.282762] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Unregistered the VM {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 738.285892] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Deleting contents of the VM from datastore datastore1 {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 738.285892] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Deleting the datastore file [datastore1] 053d549e-b3d6-4498-9261-cfacaf8b43bd {{(pid=59534) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 738.285892] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-91b42d3a-0493-495b-aeac-9de38be120e9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.292092] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Waiting for the task: (returnval){
[ 738.292092] env[59534]: value = "task-1308575"
[ 738.292092] env[59534]: _type = "Task"
[ 738.292092] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 738.303332] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Task: {'id': task-1308575, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 738.524411] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "953994f5-677e-40a8-8008-074630080334" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 738.525661] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "953994f5-677e-40a8-8008-074630080334" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 738.535452] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 738.578211] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Preparing fetch location {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 738.578923] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating directory with path [datastore1] vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 738.579447] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-053aa4d9-9a91-450d-b2d5-1b0aaffedba0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.595823] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Created directory with path [datastore1] vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 738.596109] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Fetch image to [datastore1] vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 738.596109] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Downloading image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to [datastore1] vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk on the data store datastore1 {{(pid=59534) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 738.597358] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea6b25bb-9866-4711-9a2c-2de9cd7f0bca {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.604983] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 738.604983] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 738.606380] env[59534]: INFO nova.compute.claims [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 738.611268] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-110f4c32-0538-49c3-b1c1-e1a8def4e8a0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.621858] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20e75884-c572-4ed2-8759-8a16c92209ff {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.658838] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35c8cfe7-26e5-4f95-a2c8-48e7a371f6b5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.668035] env[59534]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cb6854bb-9dd9-4290-92ab-680d889a8cc6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.717680] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 738.737781] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Releasing lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 738.738113] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Start
destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 738.738306] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 738.738891] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1a7ca7ed-2294-4b0a-a16e-ef1795c994ec {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.751070] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9885841-b930-49dc-b07f-de4e8ad39511 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.768578] env[59534]: DEBUG nova.virt.vmwareapi.images [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Downloading image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to the data store datastore1 {{(pid=59534) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 738.787084] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cbd11b11-3927-4f2f-81c5-9e282f432273 could not be found. 
[ 738.787273] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 738.787584] env[59534]: INFO nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Took 0.05 seconds to destroy the instance on the hypervisor. [ 738.787945] env[59534]: DEBUG oslo.service.loopingcall [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 738.788127] env[59534]: DEBUG nova.compute.manager [-] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 738.788216] env[59534]: DEBUG nova.network.neutron [-] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 738.803386] env[59534]: DEBUG oslo_vmware.api [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Task: {'id': task-1308575, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043885} completed successfully. 
{{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 738.807338] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Deleted the datastore file {{(pid=59534) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 738.807925] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Deleted contents of the VM from datastore datastore1 {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 738.807925] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 738.807925] env[59534]: INFO nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Took 0.57 seconds to destroy the instance on the hypervisor. [ 738.808132] env[59534]: DEBUG oslo.service.loopingcall [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 738.811396] env[59534]: DEBUG nova.compute.manager [-] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Skipping network deallocation for instance since networking was not requested. {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 738.812643] env[59534]: DEBUG nova.compute.claims [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 738.812899] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.833905] env[59534]: DEBUG nova.network.neutron [-] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.848556] env[59534]: DEBUG nova.compute.manager [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Received event network-changed-19c303ea-3949-49ef-b1da-9130f02425b2 {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 738.848769] env[59534]: DEBUG nova.compute.manager [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Refreshing instance network info cache due to event network-changed-19c303ea-3949-49ef-b1da-9130f02425b2. 
{{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 738.848933] env[59534]: DEBUG oslo_concurrency.lockutils [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] Acquiring lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 738.849254] env[59534]: DEBUG oslo_concurrency.lockutils [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] Acquired lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.849254] env[59534]: DEBUG nova.network.neutron [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Refreshing network info cache for port 19c303ea-3949-49ef-b1da-9130f02425b2 {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 738.853958] env[59534]: DEBUG nova.network.neutron [-] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.860808] env[59534]: DEBUG oslo_vmware.rw_handles [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59534) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 738.920287] env[59534]: INFO nova.compute.manager [-] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Took 0.13 seconds to deallocate network for instance. [ 738.925539] env[59534]: DEBUG oslo_vmware.rw_handles [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Completed reading data from the image iterator. {{(pid=59534) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 738.925539] env[59534]: DEBUG oslo_vmware.rw_handles [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59534) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 738.926943] env[59534]: DEBUG nova.compute.claims [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 738.926943] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.937446] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83a10b3a-9c15-4ba8-8c0d-6c3ca1fe0947 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.946010] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f45e16-6a73-4bf4-ab57-5355d51db99b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.976570] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-479d7da5-3d5d-42d8-a8a1-f81a572f61b4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.979881] env[59534]: DEBUG nova.network.neutron [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.987114] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81437cbf-3e3c-4eae-9159-aa730c7cc4e2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.001035] env[59534]: DEBUG nova.compute.provider_tree [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.009995] env[59534]: DEBUG nova.scheduler.client.report [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.029031] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.423s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.029031] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e 
tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 739.031115] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.218s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.087265] env[59534]: DEBUG nova.compute.utils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 739.089276] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 739.089276] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 739.099813] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 739.192364] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 739.229738] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 739.229988] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 739.230155] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 739.230334] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Flavor pref 0:0:0 {{(pid=59534) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 739.230755] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 739.230755] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 739.230852] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 739.230956] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 739.231123] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 739.231284] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 739.231452] env[59534]: DEBUG nova.virt.hardware [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 739.232531] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31cdd292-f356-41e9-aaf4-e06a21620c4c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.241505] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803f2a8a-4838-48b0-a160-25db320fc4ac {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.246832] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aed92b00-59bf-4660-a0b6-4f7e73cc696f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.264565] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3148cdee-c3a1-4cef-8b0a-540457b1a391 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.294969] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22ffead9-0bb8-4968-82fa-026a949fc08d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.302939] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce80341-1f14-4283-ab6c-1425e5dea17b {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.317987] env[59534]: DEBUG nova.compute.provider_tree [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.322710] env[59534]: DEBUG nova.policy [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1de811c49e4475e8bebd9420cb053e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f07a595f0d54471b9b09e9b1b9b0b5a5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 739.329839] env[59534]: DEBUG nova.scheduler.client.report [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.350308] env[59534]: DEBUG 
oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.319s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.352823] env[59534]: ERROR nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.352823] env[59534]: Faults: ['InvalidArgument'] [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Traceback (most recent call last): [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self.driver.spawn(context, instance, image_meta, [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self._fetch_image_if_missing(context, vi) [ 739.352823] env[59534]: ERROR 
nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] image_cache(vi, tmp_image_ds_loc)
[ 739.352823] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] vm_util.copy_virtual_disk(
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] session._wait_for_task(vmdk_copy_task)
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] return self.wait_for_task(task_ref)
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] return evt.wait()
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] result = hub.switch()
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] return self.greenlet.switch()
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 739.353251] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] self.f(*self.args, **self.kw)
[ 739.353670] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 739.353670] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] raise exceptions.translate_fault(task_info.error)
[ 739.353670] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 739.353670] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Faults: ['InvalidArgument']
[ 739.353670] env[59534]: ERROR nova.compute.manager [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd]
[ 739.353670] env[59534]: DEBUG nova.compute.utils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] VimFaultException {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 739.356442] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Build of instance 053d549e-b3d6-4498-9261-cfacaf8b43bd was re-scheduled: A specified parameter was not correct: fileType
[ 739.356442] env[59534]: Faults: ['InvalidArgument'] {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 739.356722] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 739.358784] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquiring lock "refresh_cache-053d549e-b3d6-4498-9261-cfacaf8b43bd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 739.358784] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Acquired lock "refresh_cache-053d549e-b3d6-4498-9261-cfacaf8b43bd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 739.358784] env[59534]: DEBUG nova.network.neutron [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 739.358784] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.431s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 739.394890] env[59534]: DEBUG nova.network.neutron [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 739.410924] env[59534]: DEBUG oslo_concurrency.lockutils [req-39e93f12-a6f0-4325-aabd-843af378960f req-fa577665-fa15-4538-a6cb-1a07ffc198f1 service nova] Releasing lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 739.433790] env[59534]: DEBUG nova.network.neutron [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 739.556182] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9abfb285-363d-4176-93d3-663e7ce9fcc8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.562976] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5345b34-a139-4c3e-864d-0cdb6466e39e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.594347] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6ed94b2-1ace-4fcc-8942-db5e50a281d8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.602757] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9c757be-c064-4ff0-bc1a-286164f348ab {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.617996] env[59534]: DEBUG nova.compute.provider_tree [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 739.627923] env[59534]: DEBUG nova.scheduler.client.report [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 739.646373] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 739.646771] env[59534]: ERROR nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Traceback (most recent call last):
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self.driver.spawn(context, instance, image_meta,
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] vm_ref = self.build_virtual_machine(instance,
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] vif_infos = vmwarevif.get_vif_info(self._session,
[ 739.646771] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] for vif in network_info:
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return self._sync_wrapper(fn, *args, **kwargs)
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self.wait()
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self[:] = self._gt.wait()
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return self._exit_event.wait()
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] result = hub.switch()
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return self.greenlet.switch()
[ 739.647113] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] result = function(*args, **kwargs)
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] return func(*args, **kwargs)
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] raise e
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] nwinfo = self.network_api.allocate_for_instance(
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] created_port_ids = self._update_ports_for_instance(
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] with excutils.save_and_reraise_exception():
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 739.647538] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] self.force_reraise()
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] raise self.value
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] updated_port = self._update_port(
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] _ensure_no_port_binding_failure(port)
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] raise exception.PortBindingFailed(port_id=port['id'])
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] nova.exception.PortBindingFailed: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information.
[ 739.647904] env[59534]: ERROR nova.compute.manager [instance: cbd11b11-3927-4f2f-81c5-9e282f432273]
[ 739.647904] env[59534]: DEBUG nova.compute.utils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 739.649241] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Build of instance cbd11b11-3927-4f2f-81c5-9e282f432273 was re-scheduled: Binding failed for port 19c303ea-3949-49ef-b1da-9130f02425b2, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 739.649674] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 739.649900] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquiring lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 739.650086] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Acquired lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 739.650299] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 739.726188] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 739.886964] env[59534]: DEBUG nova.network.neutron [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 739.896492] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Releasing lock "refresh_cache-053d549e-b3d6-4498-9261-cfacaf8b43bd" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 739.896729] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 739.896905] env[59534]: DEBUG nova.compute.manager [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] [instance: 053d549e-b3d6-4498-9261-cfacaf8b43bd] Skipping network deallocation for instance since networking was not requested. {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 739.986528] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.986973] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.986973] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.991836] env[59534]: INFO nova.scheduler.client.report [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Deleted allocations for instance 053d549e-b3d6-4498-9261-cfacaf8b43bd
[ 740.024916] env[59534]: DEBUG oslo_concurrency.lockutils [None req-f777dab6-5220-427c-ad83-c515f417aa1e tempest-ServerDiagnosticsV248Test-1888611138 tempest-ServerDiagnosticsV248Test-1888611138-project-member] Lock "053d549e-b3d6-4498-9261-cfacaf8b43bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 50.892s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 740.398624] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 740.411590] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Releasing lock "refresh_cache-cbd11b11-3927-4f2f-81c5-9e282f432273" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 740.411819] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 740.412056] env[59534]: DEBUG nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 740.412177] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 740.505973] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Successfully created port: 900fb54a-0dce-428f-8c20-ddc4a7315725 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 740.507866] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 740.517309] env[59534]: DEBUG nova.network.neutron [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 740.528581] env[59534]: INFO nova.compute.manager [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] [instance: cbd11b11-3927-4f2f-81c5-9e282f432273] Took 0.12 seconds to deallocate network for instance.
[ 740.553136] env[59534]: ERROR nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information.
[ 740.553136] env[59534]: ERROR nova.compute.manager Traceback (most recent call last):
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 740.553136] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.553136] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.553136] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.553136] env[59534]: ERROR nova.compute.manager self.force_reraise()
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.553136] env[59534]: ERROR nova.compute.manager raise self.value
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.553136] env[59534]: ERROR nova.compute.manager updated_port = self._update_port(
[ 740.553136] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.553136] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 740.553561] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.553561] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 740.553561] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information.
[ 740.553561] env[59534]: ERROR nova.compute.manager
[ 740.553561] env[59534]: Traceback (most recent call last):
[ 740.553561] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 740.553561] env[59534]: listener.cb(fileno)
[ 740.553561] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 740.553561] env[59534]: result = function(*args, **kwargs)
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 740.553561] env[59534]: return func(*args, **kwargs)
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 740.553561] env[59534]: raise e
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 740.553561] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.553561] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.553561] env[59534]: with excutils.save_and_reraise_exception():
[ 740.553561] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.553561] env[59534]: self.force_reraise()
[ 740.553561] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.553561] env[59534]: raise self.value
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.553561] env[59534]: updated_port = self._update_port(
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.553561] env[59534]: _ensure_no_port_binding_failure(port)
[ 740.553561] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.553561] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 740.554417] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information.
[ 740.554417] env[59534]: Removing descriptor: 21
[ 740.554417] env[59534]: ERROR nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information.
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] Traceback (most recent call last):
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] yield resources
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self.driver.spawn(context, instance, image_meta,
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 740.554417] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] vm_ref = self.build_virtual_machine(instance,
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] vif_infos = vmwarevif.get_vif_info(self._session,
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] for vif in network_info:
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return self._sync_wrapper(fn, *args, **kwargs)
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self.wait()
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self[:] = self._gt.wait()
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return self._exit_event.wait()
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 740.554733] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] result = hub.switch()
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return self.greenlet.switch()
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] result = function(*args, **kwargs)
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return func(*args, **kwargs)
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] raise e
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] nwinfo = self.network_api.allocate_for_instance(
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] created_port_ids = self._update_ports_for_instance(
[ 740.555106] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] with excutils.save_and_reraise_exception():
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self.force_reraise()
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] raise self.value
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] updated_port = self._update_port(
[ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] _ensure_no_port_binding_failure(port) [ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] raise exception.PortBindingFailed(port_id=port['id']) [ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information. [ 740.555453] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] [ 740.555778] env[59534]: INFO nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Terminating instance [ 740.559149] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquiring lock "refresh_cache-57165716-986f-4582-ae73-e60fde250240" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 740.559320] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquired lock "refresh_cache-57165716-986f-4582-ae73-e60fde250240" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 740.559485] env[59534]: DEBUG nova.network.neutron [None 
req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 740.640928] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.671944] env[59534]: INFO nova.scheduler.client.report [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Deleted allocations for instance cbd11b11-3927-4f2f-81c5-9e282f432273 [ 740.683175] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.687349] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.704783] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0cda8a7b-1695-4437-96ca-ce5bb6f5de23 tempest-ServerPasswordTestJSON-698716183 tempest-ServerPasswordTestJSON-698716183-project-member] Lock "cbd11b11-3927-4f2f-81c5-9e282f432273" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.760s {{(pid=59534) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.712030] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.713257] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.713257] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.713640] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59534) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 740.715163] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c648022f-3576-4b73-b84f-0302c09fbb00 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.726294] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc9a6a28-9dc6-4920-ace7-c3f516fd9ad9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.748071] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b5103b2-765b-46a9-9bb0-2916881a58d9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.756249] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b510e49-9999-4290-b2bf-d9445e8697c3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.789771] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181494MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59534) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 740.789939] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.790162] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.855967] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 5a549ffd-3cc3-4723-bfe6-510dbef0fea7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.856142] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 1d6fb105-7087-4bdf-9b1c-b194baf39a55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.856268] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 57165716-986f-4582-ae73-e60fde250240 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.856411] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 192ab790-d9db-4990-93d0-24603bd65016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.856527] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 744abb9c-379e-477b-a493-210cde36c314 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.856642] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 953994f5-677e-40a8-8008-074630080334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.885761] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 3b30863b-8dc5-43e1-a222-ddcc6945af5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.885992] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 740.886150] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 740.902894] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "3b30863b-8dc5-43e1-a222-ddcc6945af5f" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.902894] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "3b30863b-8dc5-43e1-a222-ddcc6945af5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.916227] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 740.972881] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.014008] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5fde9b2-c6a8-43ce-9037-bf64fab9af5c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.022925] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d4f2f12-185a-41c3-bd52-7d4a050855ee {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.060073] env[59534]: DEBUG nova.network.neutron [None 
req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.061975] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88662d02-b7cf-4ce4-b4e4-3bbcfb979f60 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.070518] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa876581-9444-42ba-b757-f9679309fbfe {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.079343] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Releasing lock "refresh_cache-57165716-986f-4582-ae73-e60fde250240" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.079836] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 741.079961] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 741.088541] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a7753f02-b889-45ae-ab62-9e6eb7f30823 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.090698] env[59534]: DEBUG nova.compute.provider_tree [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.098982] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c8c0058-2124-424e-a292-4609b01de7d3 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.112668] env[59534]: DEBUG nova.scheduler.client.report [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.128681] env[59534]: WARNING nova.virt.vmwareapi.vmops [None 
req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 57165716-986f-4582-ae73-e60fde250240 could not be found. [ 741.129068] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 741.129131] env[59534]: INFO nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Took 0.05 seconds to destroy the instance on the hypervisor. [ 741.129334] env[59534]: DEBUG oslo.service.loopingcall [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 741.129551] env[59534]: DEBUG nova.compute.manager [-] [instance: 57165716-986f-4582-ae73-e60fde250240] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 741.129645] env[59534]: DEBUG nova.network.neutron [-] [instance: 57165716-986f-4582-ae73-e60fde250240] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.132279] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59534) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 741.132445] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.133030] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.160s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.134454] env[59534]: INFO nova.compute.claims [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 741.296685] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff1fe0f4-8506-4569-b2aa-2d2316e8ce95 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.304503] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7680b99f-0ea9-4317-99df-db54d8f5b773 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.335873] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fe9b4a1-fb96-4fe3-b629-34522f275919 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.343682] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92c3f9bd-69ad-471a-8e80-0d1944197937 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.359293] env[59534]: DEBUG nova.compute.provider_tree [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.368023] env[59534]: DEBUG nova.scheduler.client.report [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.380955] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.381441] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 741.384254] env[59534]: DEBUG nova.network.neutron [-] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.391747] env[59534]: DEBUG nova.network.neutron [-] [instance: 57165716-986f-4582-ae73-e60fde250240] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.395148] env[59534]: ERROR nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. 
[ 741.395148] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 741.395148] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.395148] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 741.395148] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.395148] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 741.395148] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.395148] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 741.395148] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.395148] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 741.395148] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.395148] env[59534]: ERROR nova.compute.manager raise self.value [ 741.395148] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.395148] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 741.395148] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.395148] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 741.396112] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.396112] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 741.396112] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. [ 741.396112] env[59534]: ERROR nova.compute.manager [ 741.396112] env[59534]: Traceback (most recent call last): [ 741.396112] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 741.396112] env[59534]: listener.cb(fileno) [ 741.396112] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 741.396112] env[59534]: result = function(*args, **kwargs) [ 741.396112] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 741.396112] env[59534]: return func(*args, **kwargs) [ 741.396112] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 741.396112] env[59534]: raise e [ 741.396112] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.396112] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 741.396112] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.396112] env[59534]: created_port_ids = self._update_ports_for_instance( [ 741.396112] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.396112] env[59534]: with excutils.save_and_reraise_exception(): [ 741.396112] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.396112] env[59534]: self.force_reraise() [ 741.396112] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.396112] env[59534]: raise self.value [ 741.396112] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.396112] env[59534]: updated_port = self._update_port( [ 741.396112] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.396112] env[59534]: _ensure_no_port_binding_failure(port) [ 741.396112] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.396112] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 741.397279] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. [ 741.397279] env[59534]: Removing descriptor: 18 [ 741.397279] env[59534]: ERROR nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. 
[ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] Traceback (most recent call last): [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] yield resources [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self.driver.spawn(context, instance, image_meta, [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self._vmops.spawn(context, instance, image_meta, injected_files, [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 741.397279] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] vm_ref = self.build_virtual_machine(instance, [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] vif_infos = vmwarevif.get_vif_info(self._session, [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 741.397817] env[59534]: ERROR 
nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] for vif in network_info: [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return self._sync_wrapper(fn, *args, **kwargs) [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self.wait() [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self[:] = self._gt.wait() [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return self._exit_event.wait() [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 741.397817] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] result = hub.switch() [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return self.greenlet.switch() [ 741.398406] env[59534]: ERROR 
nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] result = function(*args, **kwargs) [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return func(*args, **kwargs) [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] raise e [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] nwinfo = self.network_api.allocate_for_instance( [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] created_port_ids = self._update_ports_for_instance( [ 741.398406] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] with excutils.save_and_reraise_exception(): [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 
192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self.force_reraise() [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] raise self.value [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] updated_port = self._update_port( [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] _ensure_no_port_binding_failure(port) [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] raise exception.PortBindingFailed(port_id=port['id']) [ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. 
[ 741.398959] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] [ 741.399678] env[59534]: INFO nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Terminating instance [ 741.399678] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquiring lock "refresh_cache-192ab790-d9db-4990-93d0-24603bd65016" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 741.399777] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquired lock "refresh_cache-192ab790-d9db-4990-93d0-24603bd65016" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 741.399894] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 741.410472] env[59534]: INFO nova.compute.manager [-] [instance: 57165716-986f-4582-ae73-e60fde250240] Took 0.28 seconds to deallocate network for instance. 
[ 741.412880] env[59534]: DEBUG nova.compute.claims [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 741.413063] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.413273] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.417362] env[59534]: DEBUG nova.compute.utils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 741.418427] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 741.418592] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 741.427017] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 741.489707] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.515508] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 741.537686] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 741.537937] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 741.538106] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 741.538282] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Flavor pref 0:0:0 {{(pid=59534) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 741.538424] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 741.538566] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 741.538764] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 741.538942] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 741.539135] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 741.539298] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 741.539466] env[59534]: DEBUG nova.virt.hardware [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 741.541839] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16c992d6-ed9e-4987-9953-f43353c392a1 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.551207] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e42d9e-8d42-413b-b2cc-9a87e5dd26cf {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.596220] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4e40a7a-7ad7-4ea7-87ae-361f90a6af76 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.604037] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61946219-f1b2-4684-8b93-1080cc152a50 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.636771] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80facbff-eb9b-47d5-af30-794cb17ac7bb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.645462] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e478e85-c61e-466d-9627-b55b79f3c9a0 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.661066] env[59534]: DEBUG nova.compute.provider_tree [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.663501] env[59534]: DEBUG nova.policy [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55181ba3de204d70a5bd1fc4e92ad17c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61fe8d56a7ce444e87d2e7743b3a961f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.675019] env[59534]: DEBUG nova.scheduler.client.report [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.689830] env[59534]: DEBUG oslo_concurrency.lockutils 
[None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.276s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.690473] env[59534]: ERROR nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information. [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] Traceback (most recent call last): [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self.driver.spawn(context, instance, image_meta, [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self._vmops.spawn(context, instance, image_meta, injected_files, [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] vm_ref = self.build_virtual_machine(instance, [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 
57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] vif_infos = vmwarevif.get_vif_info(self._session, [ 741.690473] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] for vif in network_info: [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return self._sync_wrapper(fn, *args, **kwargs) [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self.wait() [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self[:] = self._gt.wait() [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return self._exit_event.wait() [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 
741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] result = hub.switch() [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return self.greenlet.switch() [ 741.690815] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] result = function(*args, **kwargs) [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] return func(*args, **kwargs) [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] raise e [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] nwinfo = self.network_api.allocate_for_instance( [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] 
created_port_ids = self._update_ports_for_instance( [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] with excutils.save_and_reraise_exception(): [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.691218] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] self.force_reraise() [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] raise self.value [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] updated_port = self._update_port( [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] _ensure_no_port_binding_failure(port) [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] raise 
exception.PortBindingFailed(port_id=port['id']) [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] nova.exception.PortBindingFailed: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information. [ 741.691599] env[59534]: ERROR nova.compute.manager [instance: 57165716-986f-4582-ae73-e60fde250240] [ 741.691599] env[59534]: DEBUG nova.compute.utils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 741.692922] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Build of instance 57165716-986f-4582-ae73-e60fde250240 was re-scheduled: Binding failed for port 364b7b07-d7bf-4503-b03c-8a3c16403c51, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 741.693746] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 741.693746] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquiring lock "refresh_cache-57165716-986f-4582-ae73-e60fde250240" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 741.693746] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Acquired lock "refresh_cache-57165716-986f-4582-ae73-e60fde250240" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 741.693890] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 741.773029] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.925544] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquiring lock "441810a5-3977-4c39-9c4f-3157678196a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.925824] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Lock "441810a5-3977-4c39-9c4f-3157678196a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.935195] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 741.985950] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.986252] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.987758] env[59534]: INFO nova.compute.claims [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 742.033961] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.042094] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Releasing lock "refresh_cache-192ab790-d9db-4990-93d0-24603bd65016" {{(pid=59534) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 742.042511] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 742.042712] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 742.043198] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-448214aa-5c93-4ddc-b328-926622d777e5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.056366] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b644096-da6e-4fdd-8824-05a4f1741495 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.088665] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 192ab790-d9db-4990-93d0-24603bd65016 could not be found. 
[ 742.089062] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 742.089404] env[59534]: INFO nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Took 0.05 seconds to destroy the instance on the hypervisor. [ 742.089815] env[59534]: DEBUG oslo.service.loopingcall [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 742.093687] env[59534]: DEBUG nova.compute.manager [-] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 742.093798] env[59534]: DEBUG nova.network.neutron [-] [instance: 192ab790-d9db-4990-93d0-24603bd65016] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 742.137018] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.137018] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Starting heal instance info cache {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 742.137018] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Rebuilding the list of instances to heal {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 742.160741] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.160741] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Skipping network cache update for instance because it is Building. 
{{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.160741] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.160741] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 744abb9c-379e-477b-a493-210cde36c314] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.160741] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 953994f5-677e-40a8-8008-074630080334] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.161036] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.161036] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.161036] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Didn't find any instances for network info cache update. 
{{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 742.161036] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.161036] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.161036] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.161245] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59534) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 742.182757] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e1c316-eb50-4033-a6d8-13542c98fcb6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.191360] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00dc7d1d-681a-4b89-8733-40854471e036 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.936655] env[59534]: ERROR nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. [ 742.936655] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 742.936655] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 742.936655] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 742.936655] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.936655] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 742.936655] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.936655] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 742.936655] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.936655] env[59534]: ERROR nova.compute.manager 
self.force_reraise() [ 742.936655] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.936655] env[59534]: ERROR nova.compute.manager raise self.value [ 742.936655] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.936655] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 742.936655] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.936655] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 742.937230] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.937230] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 742.937230] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. 
[ 742.937230] env[59534]: ERROR nova.compute.manager [ 742.937230] env[59534]: Traceback (most recent call last): [ 742.937230] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 742.937230] env[59534]: listener.cb(fileno) [ 742.937230] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 742.937230] env[59534]: result = function(*args, **kwargs) [ 742.937230] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 742.937230] env[59534]: return func(*args, **kwargs) [ 742.937230] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 742.937230] env[59534]: raise e [ 742.937230] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 742.937230] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 742.937230] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.937230] env[59534]: created_port_ids = self._update_ports_for_instance( [ 742.937230] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.937230] env[59534]: with excutils.save_and_reraise_exception(): [ 742.937230] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.937230] env[59534]: self.force_reraise() [ 742.937230] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.937230] env[59534]: raise self.value [ 742.937230] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.937230] env[59534]: updated_port = self._update_port( [ 742.937230] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.937230] env[59534]: _ensure_no_port_binding_failure(port) [ 742.937230] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.937230] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 742.938132] env[59534]: nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. [ 742.938132] env[59534]: Removing descriptor: 20 [ 742.938132] env[59534]: DEBUG nova.network.neutron [-] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 742.939109] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.945155] env[59534]: ERROR nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. 
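The traceback above shows nova wrapping each port update in `oslo_utils.excutils.save_and_reraise_exception()`, so cleanup can run inside the `with` block before the original exception is re-raised via `force_reraise()`. A minimal self-contained sketch of that pattern (a simplified reimplementation for illustration, not the actual `oslo_utils` code; `update_port`/`update_ports` are hypothetical stand-ins for nova's `_update_port`/`_update_ports_for_instance`):

```python
# Simplified sketch of the save_and_reraise_exception pattern visible in
# the traceback above (the real class lives in oslo_utils.excutils).
class save_and_reraise_exception:
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Capture the in-flight exception, then re-raise it after any
        # cleanup performed in the with block has run.
        self.value = exc_val
        if exc_val is not None:
            self.force_reraise()
        return False

    def force_reraise(self):
        raise self.value


def update_port(port_id, fail=False):
    # Stand-in for nova's _update_port; mimics PortBindingFailed.
    if fail:
        raise RuntimeError(f"Binding failed for port {port_id}")
    return port_id


def update_ports(port_ids):
    updated = []
    for pid in port_ids:
        with save_and_reraise_exception():
            # Cleanup of already-updated ports would happen here before
            # the original exception propagates to the caller.
            updated.append(update_port(pid, fail=(pid == "bad")))
    return updated
```

The point of the pattern, as in the log, is that the caller still sees the original `PortBindingFailed` rather than any exception raised by the cleanup itself.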
[ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] Traceback (most recent call last): [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] yield resources [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self.driver.spawn(context, instance, image_meta, [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self._vmops.spawn(context, instance, image_meta, injected_files, [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] vm_ref = self.build_virtual_machine(instance, [ 742.945155] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] vif_infos = vmwarevif.get_vif_info(self._session, [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 742.945731] env[59534]: ERROR 
nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] for vif in network_info: [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return self._sync_wrapper(fn, *args, **kwargs) [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self.wait() [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self[:] = self._gt.wait() [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return self._exit_event.wait() [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] result = hub.switch() [ 742.945731] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return self.greenlet.switch() [ 742.949298] env[59534]: ERROR 
nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] result = function(*args, **kwargs) [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return func(*args, **kwargs) [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] raise e [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] nwinfo = self.network_api.allocate_for_instance( [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] created_port_ids = self._update_ports_for_instance( [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.949298] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] with excutils.save_and_reraise_exception(): [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 
744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self.force_reraise() [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] raise self.value [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] updated_port = self._update_port( [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] _ensure_no_port_binding_failure(port) [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] raise exception.PortBindingFailed(port_id=port['id']) [ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. 
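Every `PortBindingFailed` in this log is raised by `_ensure_no_port_binding_failure` in `nova/network/neutron.py` (line 294 in the tracebacks). A hedged sketch of that check, assuming it inspects the `binding:vif_type` attribute Neutron returns on the port (the exception class and helper below are simplified illustrations, not the exact nova source):

```python
# Illustrative sketch of the check behind the PortBindingFailed errors
# above: Neutron reports a failed binding by setting the port's
# 'binding:vif_type' to 'binding_failed'.
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    if port.get("binding:vif_type") == "binding_failed":
        raise PortBindingFailed(port_id=port["id"])


# A successfully bound port passes silently...
ensure_no_port_binding_failure({"id": "p1", "binding:vif_type": "ovs"})
# ...while a port left in 'binding_failed' state (as for port
# da9d1c71-80d0-4a68-88b1-7bb68e089ee6 above) would raise.
bad_port = {"id": "da9d1c71-80d0-4a68-88b1-7bb68e089ee6",
            "binding:vif_type": "binding_failed"}
```

This is why the error message directs the operator to the neutron logs: nova only observes the failed binding result, not its cause.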
[ 742.949909] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] [ 742.950223] env[59534]: INFO nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Terminating instance [ 742.950223] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "eaeb9be3-b904-4e62-8ed3-301829c6a2ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.950223] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "eaeb9be3-b904-4e62-8ed3-301829c6a2ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.950223] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "ec3f2585-2dff-4bd8-97b9-ad9357835761" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.950381] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "ec3f2585-2dff-4bd8-97b9-ad9357835761" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.954024] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "refresh_cache-744abb9c-379e-477b-a493-210cde36c314" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 742.954024] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquired lock "refresh_cache-744abb9c-379e-477b-a493-210cde36c314" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 742.954024] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 742.976741] env[59534]: DEBUG nova.network.neutron [-] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.981770] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Releasing lock "refresh_cache-57165716-986f-4582-ae73-e60fde250240" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 742.981959] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 
tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 742.982139] env[59534]: DEBUG nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 742.982300] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 742.983831] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Starting instance... {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 742.986136] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 742.988737] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d035b7f-423c-4444-b799-3b881fcd6e79 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.992756] env[59534]: INFO nova.compute.manager [-] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Took 0.90 seconds to deallocate network for instance. [ 742.995344] env[59534]: DEBUG nova.compute.claims [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 742.995530] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.999173] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91177ea5-c0d3-4364-b4dd-5fa28394b7f0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.013536] env[59534]: DEBUG nova.compute.provider_tree [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.020898] env[59534]: DEBUG nova.network.neutron [None 
req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Successfully created port: 210637bf-150e-4fa7-867a-baba46769887 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 743.028111] env[59534]: DEBUG nova.scheduler.client.report [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.046247] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.052766] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.061535] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.075s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.061986] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 743.064800] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.067798] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.072s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.077622] env[59534]: DEBUG nova.network.neutron [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.083254] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.089916] env[59534]: INFO nova.compute.manager [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] [instance: 57165716-986f-4582-ae73-e60fde250240] Took 0.11 seconds to deallocate network for instance. 
[ 743.142700] env[59534]: DEBUG nova.compute.utils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 743.147172] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 743.147172] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 743.159428] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 743.232691] env[59534]: INFO nova.scheduler.client.report [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Deleted allocations for instance 57165716-986f-4582-ae73-e60fde250240 [ 743.245138] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 743.253887] env[59534]: DEBUG oslo_concurrency.lockutils [None req-536d5a0f-9aea-422b-863e-90d5621382f4 tempest-ServerMetadataTestJSON-849325499 tempest-ServerMetadataTestJSON-849325499-project-member] Lock "57165716-986f-4582-ae73-e60fde250240" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.813s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.277053] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 743.277294] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 743.277443] env[59534]: DEBUG nova.virt.hardware 
[None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 743.277618] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 743.277784] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 743.277933] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 743.278155] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 743.279994] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 743.279994] env[59534]: 
DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 743.279994] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 743.279994] env[59534]: DEBUG nova.virt.hardware [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 743.279994] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c782a5a0-9833-440e-b256-ab6cf98fef6b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.291405] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c854344b-f695-47e7-a0d0-e9ee876796ef {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.355409] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-840c4c61-6876-4a19-aa34-6a7a21c63e30 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.362679] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c5303b-d22a-440a-84ed-3318236af1e8 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.392669] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ffd3bfa-3837-4b1a-9ea0-f3afe3e2de5f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.400386] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-974fb0ca-8d11-4f33-a65d-c095d0f608b8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.415123] env[59534]: DEBUG nova.compute.provider_tree [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.427830] env[59534]: DEBUG nova.scheduler.client.report [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.439235] env[59534]: DEBUG nova.policy [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Policy check for 
network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '36d75c18e68643fb96386273f98d6c08', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c893f5f02774f5d8968f0ab9350daa1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 743.441789] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.374s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.442383] env[59534]: ERROR nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. 
[ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] Traceback (most recent call last): [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self.driver.spawn(context, instance, image_meta, [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self._vmops.spawn(context, instance, image_meta, injected_files, [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] vm_ref = self.build_virtual_machine(instance, [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] vif_infos = vmwarevif.get_vif_info(self._session, [ 743.442383] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] for vif in network_info: [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 743.442702] env[59534]: ERROR 
nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return self._sync_wrapper(fn, *args, **kwargs) [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self.wait() [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self[:] = self._gt.wait() [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return self._exit_event.wait() [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] result = hub.switch() [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return self.greenlet.switch() [ 743.442702] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] result = function(*args, **kwargs) [ 
743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] return func(*args, **kwargs) [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] raise e [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] nwinfo = self.network_api.allocate_for_instance( [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] created_port_ids = self._update_ports_for_instance( [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] with excutils.save_and_reraise_exception(): [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.443114] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] self.force_reraise() [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 
192ab790-d9db-4990-93d0-24603bd65016] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] raise self.value [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] updated_port = self._update_port( [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] _ensure_no_port_binding_failure(port) [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] raise exception.PortBindingFailed(port_id=port['id']) [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] nova.exception.PortBindingFailed: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. [ 743.443591] env[59534]: ERROR nova.compute.manager [instance: 192ab790-d9db-4990-93d0-24603bd65016] [ 743.443881] env[59534]: DEBUG nova.compute.utils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 743.444190] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.400s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.445499] env[59534]: INFO nova.compute.claims [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.448454] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Build of instance 192ab790-d9db-4990-93d0-24603bd65016 was re-scheduled: Binding failed for port 289d3b0f-d24b-44cd-8e43-fb69290023da, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 743.448783] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 743.449010] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquiring lock "refresh_cache-192ab790-d9db-4990-93d0-24603bd65016" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.449153] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Acquired lock "refresh_cache-192ab790-d9db-4990-93d0-24603bd65016" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 743.449304] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 743.546953] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.649370] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5306424a-6e6d-4fb7-8057-0c22217f7a5d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.658811] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fef51abf-84f1-4546-9d2d-e2a2ca5179df {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.693610] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb4f3d7c-d152-4864-8551-a2caafa69812 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.702029] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65db5dd0-75ce-440a-b27e-30c8097bcef6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.722047] env[59534]: DEBUG nova.compute.provider_tree [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.733325] env[59534]: DEBUG nova.scheduler.client.report [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.756097] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.312s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.756943] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 743.762312] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.678s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.762312] env[59534]: INFO nova.compute.claims [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.767228] env[59534]: ERROR nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 
tempest-DeleteServersTestJSON-824782706-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information. [ 743.767228] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 743.767228] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 743.767228] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 743.767228] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 743.767228] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 743.767228] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 743.767228] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 743.767228] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.767228] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 743.767228] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.767228] env[59534]: ERROR nova.compute.manager raise self.value [ 743.767228] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 743.767228] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 743.767228] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.767228] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 743.768158] env[59534]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 743.768158] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 743.768158] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information.
[ 743.768158] env[59534]: ERROR nova.compute.manager
[ 743.768158] env[59534]: Traceback (most recent call last):
[ 743.768158] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 743.768158] env[59534]: listener.cb(fileno)
[ 743.768158] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 743.768158] env[59534]: result = function(*args, **kwargs)
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 743.768158] env[59534]: return func(*args, **kwargs)
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 743.768158] env[59534]: raise e
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 743.768158] env[59534]: nwinfo = self.network_api.allocate_for_instance(
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 743.768158] env[59534]: created_port_ids = self._update_ports_for_instance(
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 743.768158] env[59534]: with excutils.save_and_reraise_exception():
[ 743.768158] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 743.768158] env[59534]: self.force_reraise()
[ 743.768158] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 743.768158] env[59534]: raise self.value
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 743.768158] env[59534]: updated_port = self._update_port(
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 743.768158] env[59534]: _ensure_no_port_binding_failure(port)
[ 743.768158] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 743.768158] env[59534]: raise exception.PortBindingFailed(port_id=port['id'])
[ 743.768864] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information.
[ 743.768864] env[59534]: Removing descriptor: 15
[ 743.768864] env[59534]: ERROR nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information.
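The stack above passes through `oslo_utils.excutils.save_and_reraise_exception`: Nova catches the `PortBindingFailed`, runs cleanup inside the context manager, and the saved exception is re-raised on exit. Below is a minimal self-contained sketch of that pattern; the context manager is a simplified stand-in, not the real `oslo_utils` implementation, and `PortBindingFailed`/`update_ports` are illustrative stand-ins for the Nova code in the traceback:

```python
import contextlib
import sys


@contextlib.contextmanager
def save_and_reraise_exception():
    """Simplified sketch of oslo_utils.excutils.save_and_reraise_exception:
    capture the active exception, let the block run cleanup, then re-raise."""
    exc = sys.exc_info()[1]  # exception active when the block is entered
    try:
        yield
    finally:
        if exc is not None:
            raise exc  # re-raise the saved exception after cleanup


class PortBindingFailed(Exception):
    """Illustrative stand-in for nova.exception.PortBindingFailed."""


def update_ports(ports, cleanup_log):
    """Illustrative stand-in for _update_ports_for_instance: on failure,
    run cleanup but still propagate the original error to the caller."""
    updated = []
    try:
        for port in ports:
            if port.get("binding_failed"):
                raise PortBindingFailed(f"Binding failed for port {port['id']}")
            updated.append(port["id"])
    except PortBindingFailed:
        with save_and_reraise_exception():
            cleanup_log.append(f"rolling back ports {updated}")
    return updated
```

The point of the pattern over a bare `raise` is that an exception thrown by the cleanup body itself does not mask the original error; the real `oslo_utils` version additionally logs the saved traceback and lets callers suppress the re-raise (the `reraise` flag).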
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] Traceback (most recent call last):
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] yield resources
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self.driver.spawn(context, instance, image_meta,
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 743.768864] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] vm_ref = self.build_virtual_machine(instance,
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] vif_infos = vmwarevif.get_vif_info(self._session,
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] for vif in network_info:
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return self._sync_wrapper(fn, *args, **kwargs)
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self.wait()
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self[:] = self._gt.wait()
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return self._exit_event.wait()
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 743.769825] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] result = hub.switch()
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return self.greenlet.switch()
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] result = function(*args, **kwargs)
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return func(*args, **kwargs)
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] raise e
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] nwinfo = self.network_api.allocate_for_instance(
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] created_port_ids = self._update_ports_for_instance(
[ 743.770212] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] with excutils.save_and_reraise_exception():
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self.force_reraise()
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] raise self.value
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] updated_port = self._update_port(
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] _ensure_no_port_binding_failure(port)
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] raise exception.PortBindingFailed(port_id=port['id'])
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information.
[ 743.770581] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334]
[ 743.770934] env[59534]: INFO nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Terminating instance
[ 743.770934] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-953994f5-677e-40a8-8008-074630080334" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 743.770934] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-953994f5-677e-40a8-8008-074630080334" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 743.770934] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 743.799630] env[59534]: DEBUG nova.compute.utils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 743.805210] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance:
ec3f2585-2dff-4bd8-97b9-ad9357835761] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 743.805210] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 743.809886] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 743.817367] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.881766] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.892928] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 743.896401] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Releasing lock "refresh_cache-744abb9c-379e-477b-a493-210cde36c314" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 743.897213] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 743.897213] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 743.897402] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6f80c133-eed6-4123-b641-49a4a6b86a46 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.916887] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f14459-4e4f-4874-9c02-82967b26c85e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.939556] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 743.939822] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 743.940085] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 743.941145] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 743.941330] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image pref 0:0:0 {{(pid=59534) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 743.941483] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 743.941695] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 743.941854] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 743.942287] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 743.942287] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 743.942467] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 
tempest-MultipleCreateTestJSON-1364053262-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 743.943174] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeb74799-a6ae-4a64-b8dc-1a14edcc0418 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.954089] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 744abb9c-379e-477b-a493-210cde36c314 could not be found. [ 743.954194] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 743.954349] env[59534]: INFO nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Took 0.06 seconds to destroy the instance on the hypervisor. [ 743.954589] env[59534]: DEBUG oslo.service.loopingcall [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 743.957280] env[59534]: DEBUG nova.compute.manager [-] [instance: 744abb9c-379e-477b-a493-210cde36c314] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 743.957280] env[59534]: DEBUG nova.network.neutron [-] [instance: 744abb9c-379e-477b-a493-210cde36c314] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 743.961010] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a4feee8-7209-41a3-aba0-0c68bcb96d79 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.982115] env[59534]: DEBUG nova.policy [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00bb3812655e400ea181ea219b316f42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e63775eb8bc34a52a2b1e79624c417e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 743.991238] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c32b8e5-9f07-4c4f-b645-534df8fabaec {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.998989] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d54c34e-c1bb-4f02-a609-5d03215333b4 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.032745] env[59534]: DEBUG nova.network.neutron [-] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.036114] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e42a6336-423d-4d95-85f1-f1227e842a9b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.046019] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d22f0e81-947e-4b71-b9da-77bb4ed34cde {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.049671] env[59534]: DEBUG nova.network.neutron [-] [instance: 744abb9c-379e-477b-a493-210cde36c314] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.061427] env[59534]: DEBUG nova.compute.provider_tree [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.065022] env[59534]: INFO nova.compute.manager [-] [instance: 744abb9c-379e-477b-a493-210cde36c314] Took 0.11 seconds to deallocate network for instance. 
[ 744.065022] env[59534]: DEBUG nova.compute.claims [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 744.065183] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.071267] env[59534]: DEBUG nova.scheduler.client.report [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.086979] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.087493] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 
tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 744.090166] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.025s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.126339] env[59534]: DEBUG nova.compute.utils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 744.127787] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Allocating IP information in the background. 
{{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 744.128141] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 744.136575] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Start building block device mappings for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 744.249623] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 744.277599] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 744.277896] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 744.278081] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 744.278265] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Flavor pref 0:0:0 
{{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 744.278408] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 744.278550] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 744.278755] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 744.278909] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 744.279551] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 744.279813] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 
tempest-MultipleCreateTestJSON-1364053262-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 744.280050] env[59534]: DEBUG nova.virt.hardware [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 744.281051] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7400543-38dd-4206-9ac6-a97eca040e79 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.286971] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.295606] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-881e3c62-22f7-4ef5-a59a-d86307212cef {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.300575] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Releasing lock "refresh_cache-192ab790-d9db-4990-93d0-24603bd65016" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 744.300831] env[59534]: DEBUG nova.compute.manager [None 
req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 744.301089] env[59534]: DEBUG nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 744.301293] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 744.303740] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b602d1a-5472-4259-ab61-eba4e1c0de2c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.319640] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8afd8494-2f24-409a-bbae-9e4a6b9f5701 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.353932] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d481c9c0-8706-4291-85bc-1abe9e7f0e8e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.362605] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98c61500-c0f0-40d6-a2f1-e46191e3219e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.377951] env[59534]: DEBUG nova.compute.provider_tree [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.388819] env[59534]: DEBUG nova.scheduler.client.report [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.403996] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.404718] env[59534]: ERROR nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Failed to build and run instance: 
nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] Traceback (most recent call last): [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self.driver.spawn(context, instance, image_meta, [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self._vmops.spawn(context, instance, image_meta, injected_files, [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] vm_ref = self.build_virtual_machine(instance, [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] vif_infos = vmwarevif.get_vif_info(self._session, [ 744.404718] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] for vif in network_info: [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 
744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return self._sync_wrapper(fn, *args, **kwargs) [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self.wait() [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] self[:] = self._gt.wait() [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return self._exit_event.wait() [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] result = hub.switch() [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return self.greenlet.switch() [ 744.405072] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 
744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] result = function(*args, **kwargs) [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] return func(*args, **kwargs) [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] raise e [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] nwinfo = self.network_api.allocate_for_instance( [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] created_port_ids = self._update_ports_for_instance( [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] with excutils.save_and_reraise_exception(): [ 744.405453] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 744.405453] env[59534]: ERROR nova.compute.manager 
[instance: 744abb9c-379e-477b-a493-210cde36c314] self.force_reraise() [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] raise self.value [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] updated_port = self._update_port( [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] _ensure_no_port_binding_failure(port) [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] raise exception.PortBindingFailed(port_id=port['id']) [ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] nova.exception.PortBindingFailed: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. 
[ 744.405793] env[59534]: ERROR nova.compute.manager [instance: 744abb9c-379e-477b-a493-210cde36c314] [ 744.406136] env[59534]: DEBUG nova.compute.utils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 744.407461] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Build of instance 744abb9c-379e-477b-a493-210cde36c314 was re-scheduled: Binding failed for port da9d1c71-80d0-4a68-88b1-7bb68e089ee6, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 744.408091] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 744.408364] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquiring lock "refresh_cache-744abb9c-379e-477b-a493-210cde36c314" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 744.408575] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Acquired lock "refresh_cache-744abb9c-379e-477b-a493-210cde36c314" {{(pid=59534) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.408780] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 744.416024] env[59534]: DEBUG nova.policy [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00bb3812655e400ea181ea219b316f42', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e63775eb8bc34a52a2b1e79624c417e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 744.421410] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.431968] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-953994f5-677e-40a8-8008-074630080334" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 744.432363] env[59534]: DEBUG nova.compute.manager [None 
req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 744.432575] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 744.433083] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ac4fec06-d87d-4db5-a24d-f1b86fd893a5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.436943] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.442801] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4eb2203-a8ff-44ba-aa61-c11e6fa5892c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.455389] env[59534]: DEBUG nova.network.neutron [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.470196] env[59534]: INFO nova.compute.manager [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] [instance: 192ab790-d9db-4990-93d0-24603bd65016] Took 0.17 seconds to deallocate network for instance. [ 744.474069] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 953994f5-677e-40a8-8008-074630080334 could not be found. 
[ 744.474069] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 744.474069] env[59534]: INFO nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Took 0.04 seconds to destroy the instance on the hypervisor. [ 744.474069] env[59534]: DEBUG oslo.service.loopingcall [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 744.474069] env[59534]: DEBUG nova.compute.manager [-] [instance: 953994f5-677e-40a8-8008-074630080334] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 744.474258] env[59534]: DEBUG nova.network.neutron [-] [instance: 953994f5-677e-40a8-8008-074630080334] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 744.504448] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.562098] env[59534]: DEBUG nova.network.neutron [-] [instance: 953994f5-677e-40a8-8008-074630080334] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.572633] env[59534]: DEBUG nova.network.neutron [-] [instance: 953994f5-677e-40a8-8008-074630080334] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.577030] env[59534]: INFO nova.scheduler.client.report [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Deleted allocations for instance 192ab790-d9db-4990-93d0-24603bd65016 [ 744.588442] env[59534]: INFO nova.compute.manager [-] [instance: 953994f5-677e-40a8-8008-074630080334] Took 0.11 seconds to deallocate network for instance. 
[ 744.593057] env[59534]: DEBUG nova.compute.claims [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 744.593057] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.593057] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.601713] env[59534]: DEBUG oslo_concurrency.lockutils [None req-e649ce2a-c2c2-4c8b-bc44-e8c0a02235f6 tempest-ServersNegativeTestMultiTenantJSON-528770027 tempest-ServersNegativeTestMultiTenantJSON-528770027-project-member] Lock "192ab790-d9db-4990-93d0-24603bd65016" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.096s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.795496] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b3f11c9-7ee8-4d17-9376-26857f56eed2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.803395] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deba7f11-dec6-40f1-910d-372d6a9c9ef6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.836290] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b294e664-2763-4ef9-9551-4b057ae4d0a8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.841223] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04eaf99e-2c32-417a-a836-0811429ac6a8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.855080] env[59534]: DEBUG nova.compute.provider_tree [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.867309] env[59534]: DEBUG nova.scheduler.client.report [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.874962] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 
tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Successfully created port: db5dff77-a34d-4c6b-9be4-52911ee6ca15 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 744.892427] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.302s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.893135] env[59534]: ERROR nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information. 
[ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] Traceback (most recent call last): [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self.driver.spawn(context, instance, image_meta, [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self._vmops.spawn(context, instance, image_meta, injected_files, [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] vm_ref = self.build_virtual_machine(instance, [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] vif_infos = vmwarevif.get_vif_info(self._session, [ 744.893135] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] for vif in network_info: [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 744.893484] env[59534]: ERROR 
nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return self._sync_wrapper(fn, *args, **kwargs) [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self.wait() [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self[:] = self._gt.wait() [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return self._exit_event.wait() [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] result = hub.switch() [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return self.greenlet.switch() [ 744.893484] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] result = function(*args, **kwargs) [ 
744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] return func(*args, **kwargs) [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] raise e [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] nwinfo = self.network_api.allocate_for_instance( [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] created_port_ids = self._update_ports_for_instance( [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] with excutils.save_and_reraise_exception(): [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 744.893853] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] self.force_reraise() [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 
953994f5-677e-40a8-8008-074630080334] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] raise self.value [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] updated_port = self._update_port( [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] _ensure_no_port_binding_failure(port) [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] raise exception.PortBindingFailed(port_id=port['id']) [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] nova.exception.PortBindingFailed: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information. [ 744.894221] env[59534]: ERROR nova.compute.manager [instance: 953994f5-677e-40a8-8008-074630080334] [ 744.894221] env[59534]: DEBUG nova.compute.utils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information. 
{{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 744.895445] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Build of instance 953994f5-677e-40a8-8008-074630080334 was re-scheduled: Binding failed for port 900fb54a-0dce-428f-8c20-ddc4a7315725, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 744.896305] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 744.896305] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquiring lock "refresh_cache-953994f5-677e-40a8-8008-074630080334" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 744.896305] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Acquired lock "refresh_cache-953994f5-677e-40a8-8008-074630080334" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.896788] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Building network info cache for instance {{(pid=59534) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 745.137417] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.292605] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Successfully created port: 7182bcb0-1e0c-4962-b068-0066cf52d5b6 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 745.356255] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.368432] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Releasing lock "refresh_cache-744abb9c-379e-477b-a493-210cde36c314" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.368666] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 745.371018] env[59534]: DEBUG nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 745.371018] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 745.448759] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.463151] env[59534]: DEBUG nova.network.neutron [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.476675] env[59534]: INFO nova.compute.manager [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] [instance: 744abb9c-379e-477b-a493-210cde36c314] Took 0.11 seconds to deallocate network for instance. 
[ 745.597458] env[59534]: INFO nova.scheduler.client.report [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Deleted allocations for instance 744abb9c-379e-477b-a493-210cde36c314 [ 745.623065] env[59534]: DEBUG oslo_concurrency.lockutils [None req-9994dfce-e774-4e95-bf5b-67baa5d03764 tempest-ImagesTestJSON-826348309 tempest-ImagesTestJSON-826348309-project-member] Lock "744abb9c-379e-477b-a493-210cde36c314" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.719s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.658109] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Successfully created port: e766518a-f477-4b12-be50-d6060b46af36 {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 745.778197] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.787871] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Releasing lock "refresh_cache-953994f5-677e-40a8-8008-074630080334" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.788423] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e 
tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 745.788629] env[59534]: DEBUG nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 745.788791] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 745.861524] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.870805] env[59534]: DEBUG nova.network.neutron [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.880493] env[59534]: INFO nova.compute.manager [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] [instance: 953994f5-677e-40a8-8008-074630080334] Took 0.09 seconds to deallocate network for instance. [ 745.974384] env[59534]: INFO nova.scheduler.client.report [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Deleted allocations for instance 953994f5-677e-40a8-8008-074630080334 [ 745.992744] env[59534]: DEBUG oslo_concurrency.lockutils [None req-7291a166-2af6-4c53-bb97-62cec487b32e tempest-DeleteServersTestJSON-824782706 tempest-DeleteServersTestJSON-824782706-project-member] Lock "953994f5-677e-40a8-8008-074630080334" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.467s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.733829] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Successfully created port: df189034-7960-4099-b3ee-4cea8604709a {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 747.079852] env[59534]: DEBUG oslo_concurrency.lockutils [None 
req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "38811b76-3497-44ff-8569-fb1e5c3952bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 747.080144] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "38811b76-3497-44ff-8569-fb1e5c3952bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 747.090761] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 747.137844] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 747.138159] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 747.139660] env[59534]: INFO nova.compute.claims [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 747.293078] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5749b3-8515-4219-a4f4-5d409fb19209 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.301468] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-331d93a8-3fe0-4c24-8177-6a26e7f66f6a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.333094] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c2334fb-18c6-469c-95a6-5e499bc90e43 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.340933] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5ccfb44-d8dd-4eb2-9e00-3e0947ebb4cd {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.355605] env[59534]: DEBUG nova.compute.provider_tree [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 747.366719] env[59534]: DEBUG nova.scheduler.client.report [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 747.386308] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 747.386898] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d 
tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 747.422196] env[59534]: DEBUG nova.compute.utils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 747.423550] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 747.423742] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 747.433305] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 747.507194] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Start spawning the instance on the hypervisor. {{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 747.529966] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T19:44:03Z,direct_url=,disk_format='vmdk',id=ca8542a2-3ba7-4624-b2be-cd49a340ac21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='31a329da9a7b4c98a2734a8492bf7dec',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T19:44:04Z,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 747.529966] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 747.529966] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d 
tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 747.530456] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 747.530880] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 747.534069] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 747.534069] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 747.534069] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 747.534069] env[59534]: DEBUG nova.virt.hardware 
[None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 747.534069] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 747.534395] env[59534]: DEBUG nova.virt.hardware [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 747.534395] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3652eaf-d34a-4ec8-b003-07dd0450870c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.542302] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bf1eb28-5f8e-4d7b-ac84-a55de44b5da9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.834617] env[59534]: DEBUG nova.policy [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f0a9d41caa34cf8a03d948175f148f5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b53e0ad3e7a2485ba3547c57d84748ef', 'project_domain_id': 'default', 
'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 748.986659] env[59534]: ERROR nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. [ 748.986659] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 748.986659] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 748.986659] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 748.986659] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 748.986659] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 748.986659] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 748.986659] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 748.986659] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 748.986659] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 748.986659] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 748.986659] env[59534]: ERROR nova.compute.manager raise self.value [ 748.986659] env[59534]: ERROR nova.compute.manager File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 748.986659] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 748.986659] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 748.986659] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 748.987398] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 748.987398] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 748.987398] env[59534]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. [ 748.987398] env[59534]: ERROR nova.compute.manager [ 748.987398] env[59534]: Traceback (most recent call last): [ 748.987398] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 748.987398] env[59534]: listener.cb(fileno) [ 748.987398] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 748.987398] env[59534]: result = function(*args, **kwargs) [ 748.987398] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 748.987398] env[59534]: return func(*args, **kwargs) [ 748.987398] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 748.987398] env[59534]: raise e [ 748.987398] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 748.987398] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 748.987398] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 748.987398] env[59534]: created_port_ids = self._update_ports_for_instance( [ 748.987398] env[59534]: File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 748.987398] env[59534]: with excutils.save_and_reraise_exception(): [ 748.987398] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 748.987398] env[59534]: self.force_reraise() [ 748.987398] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 748.987398] env[59534]: raise self.value [ 748.987398] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 748.987398] env[59534]: updated_port = self._update_port( [ 748.987398] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 748.987398] env[59534]: _ensure_no_port_binding_failure(port) [ 748.987398] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 748.987398] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 748.988212] env[59534]: nova.exception.PortBindingFailed: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. [ 748.988212] env[59534]: Removing descriptor: 19 [ 748.988212] env[59534]: ERROR nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. 
[ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Traceback (most recent call last): [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] yield resources [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self.driver.spawn(context, instance, image_meta, [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self._vmops.spawn(context, instance, image_meta, injected_files, [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 748.988212] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] vm_ref = self.build_virtual_machine(instance, [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] vif_infos = vmwarevif.get_vif_info(self._session, [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 748.991303] env[59534]: ERROR 
nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] for vif in network_info: [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return self._sync_wrapper(fn, *args, **kwargs) [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self.wait() [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self[:] = self._gt.wait() [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return self._exit_event.wait() [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 748.991303] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] result = hub.switch() [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return self.greenlet.switch() [ 748.991800] env[59534]: ERROR 
nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] result = function(*args, **kwargs) [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return func(*args, **kwargs) [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] raise e [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] nwinfo = self.network_api.allocate_for_instance( [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] created_port_ids = self._update_ports_for_instance( [ 748.991800] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] with excutils.save_and_reraise_exception(): [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: 
ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self.force_reraise() [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] raise self.value [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] updated_port = self._update_port( [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] _ensure_no_port_binding_failure(port) [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] raise exception.PortBindingFailed(port_id=port['id']) [ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] nova.exception.PortBindingFailed: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. 
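The traceback above bottoms out in `_ensure_no_port_binding_failure` (nova/network/neutron.py:294). Neutron signals a failed binding not through an API error but by setting the port's `binding:vif_type` attribute, so nova has to check it explicitly after each port update. A minimal sketch of that check (names paraphrased from the log; stand-in classes, not the real nova code):

```python
# Stand-in constant; in nova this is nova.network.model.VIF_TYPE_BINDING_FAILED.
VIF_TYPE_BINDING_FAILED = 'binding_failed'


class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check neutron "
            "logs for more information.")
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    # Neutron reports binding failure via the port's binding:vif_type
    # attribute rather than raising, so the caller must inspect it.
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])
```

This is why the "please check neutron logs" hint appears: the root cause (why the mechanism driver could not bind the port) is only visible on the neutron side.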
[ 748.992164] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] [ 748.992714] env[59534]: INFO nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Terminating instance [ 748.992714] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-ec3f2585-2dff-4bd8-97b9-ad9357835761" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 748.992714] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-ec3f2585-2dff-4bd8-97b9-ad9357835761" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 748.992714] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 749.051748] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.633927] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.650594] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-ec3f2585-2dff-4bd8-97b9-ad9357835761" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 749.651032] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 749.651237] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 749.656566] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5b15f4f2-66ca-499b-8b43-9d405c486d77 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.670597] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf3a7edc-e039-4f2f-a5e7-ee47061aa423 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.699354] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec3f2585-2dff-4bd8-97b9-ad9357835761 could not be found. [ 749.699650] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 749.700120] env[59534]: INFO nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 749.700426] env[59534]: DEBUG oslo.service.loopingcall [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 749.700968] env[59534]: DEBUG nova.compute.manager [-] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 749.700968] env[59534]: DEBUG nova.network.neutron [-] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 749.755153] env[59534]: DEBUG nova.network.neutron [-] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.766527] env[59534]: DEBUG nova.network.neutron [-] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.776649] env[59534]: INFO nova.compute.manager [-] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Took 0.08 seconds to deallocate network for instance. 
[ 749.780069] env[59534]: DEBUG nova.compute.claims [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 749.781525] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.781869] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.955098] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Successfully created port: c2cc2e50-9814-4af3-8ecb-337708efc33f {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 749.967364] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d5023d6-878a-4728-ba46-652076804865 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.975634] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50c6ad81-45c3-4ed5-abd4-b83320efd501 {{(pid=59534) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.007168] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c0d23fa-ce5d-42f6-9241-9e2d221f1b54 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.014379] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-261820df-5771-4fd9-a32a-865e64438034 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.028393] env[59534]: DEBUG nova.compute.provider_tree [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 750.041390] env[59534]: DEBUG nova.scheduler.client.report [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 750.056587] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.057190] env[59534]: ERROR nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Traceback (most recent call last): [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self.driver.spawn(context, instance, image_meta, [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self._vmops.spawn(context, instance, image_meta, injected_files, [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] vm_ref = self.build_virtual_machine(instance, [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 750.057190] env[59534]: ERROR nova.compute.manager 
[instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] vif_infos = vmwarevif.get_vif_info(self._session, [ 750.057190] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] for vif in network_info: [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return self._sync_wrapper(fn, *args, **kwargs) [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self.wait() [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self[:] = self._gt.wait() [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return self._exit_event.wait() [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] result = hub.switch() [ 750.057551] env[59534]: ERROR nova.compute.manager 
[instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return self.greenlet.switch() [ 750.057551] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] result = function(*args, **kwargs) [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] return func(*args, **kwargs) [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] raise e [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] nwinfo = self.network_api.allocate_for_instance( [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] created_port_ids = self._update_ports_for_instance( [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] with excutils.save_and_reraise_exception(): [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.057961] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] self.force_reraise() [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] raise self.value [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] updated_port = self._update_port( [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] _ensure_no_port_binding_failure(port) [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] raise exception.PortBindingFailed(port_id=port['id']) [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] nova.exception.PortBindingFailed: 
Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. [ 750.058399] env[59534]: ERROR nova.compute.manager [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] [ 750.058399] env[59534]: DEBUG nova.compute.utils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 750.059825] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Build of instance ec3f2585-2dff-4bd8-97b9-ad9357835761 was re-scheduled: Binding failed for port db5dff77-a34d-4c6b-9be4-52911ee6ca15, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 750.060336] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 750.060587] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-ec3f2585-2dff-4bd8-97b9-ad9357835761" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.060762] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-ec3f2585-2dff-4bd8-97b9-ad9357835761" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.060952] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 750.134332] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.854307] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.863277] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-ec3f2585-2dff-4bd8-97b9-ad9357835761" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 750.864306] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 750.864540] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 750.864776] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.935141] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.942742] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.952025] env[59534]: INFO nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: ec3f2585-2dff-4bd8-97b9-ad9357835761] Took 0.09 seconds to deallocate network for instance. 
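The inventory record logged for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 above can be read as schedulable capacity: for each resource class, placement allows allocations up to (total − reserved) × allocation_ratio. A small sketch of that arithmetic, using the values from the log (the formula is the standard placement capacity rule; the dict below copies only the relevant fields):

```python
# Inventory fields copied from the log record above (other fields omitted).
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}


def effective_capacity(inv):
    # Placement capacity rule: (total - reserved) * allocation_ratio.
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}
```

So this node advertises 192 schedulable VCPUs (48 physical × 4.0 overcommit) even though the failed build never consumed any of them: the claim is aborted and the allocation deleted, as the surrounding records show.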
[ 751.050115] env[59534]: INFO nova.scheduler.client.report [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Deleted allocations for instance ec3f2585-2dff-4bd8-97b9-ad9357835761 [ 751.067563] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "ec3f2585-2dff-4bd8-97b9-ad9357835761" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.118s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.698987] env[59534]: DEBUG nova.compute.manager [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Received event network-changed-210637bf-150e-4fa7-867a-baba46769887 {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 753.699287] env[59534]: DEBUG nova.compute.manager [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Refreshing instance network info cache due to event network-changed-210637bf-150e-4fa7-867a-baba46769887. 
{{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 753.699626] env[59534]: DEBUG oslo_concurrency.lockutils [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] Acquiring lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 753.699816] env[59534]: DEBUG oslo_concurrency.lockutils [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] Acquired lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.699998] env[59534]: DEBUG nova.network.neutron [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Refreshing network info cache for port 210637bf-150e-4fa7-867a-baba46769887 {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 753.860750] env[59534]: DEBUG nova.network.neutron [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.277648] env[59534]: DEBUG nova.network.neutron [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.296585] env[59534]: DEBUG oslo_concurrency.lockutils [req-fba6e5f4-b93a-45e4-bca9-0dfd563ac260 req-315789e1-f63f-4399-be59-3bdb3f427580 service nova] Releasing lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 754.489802] env[59534]: ERROR nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. 
[ 754.489802] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 754.489802] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.489802] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 754.489802] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.489802] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 754.489802] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.489802] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 754.489802] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.489802] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 754.489802] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.489802] env[59534]: ERROR nova.compute.manager raise self.value [ 754.489802] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.489802] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 754.489802] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.489802] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 754.490270] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.490270] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 754.490270] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. [ 754.490270] env[59534]: ERROR nova.compute.manager [ 754.490270] env[59534]: Traceback (most recent call last): [ 754.490270] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 754.490270] env[59534]: listener.cb(fileno) [ 754.490270] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 754.490270] env[59534]: result = function(*args, **kwargs) [ 754.490270] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 754.490270] env[59534]: return func(*args, **kwargs) [ 754.490270] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 754.490270] env[59534]: raise e [ 754.490270] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.490270] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 754.490270] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.490270] env[59534]: created_port_ids = self._update_ports_for_instance( [ 754.490270] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.490270] env[59534]: with excutils.save_and_reraise_exception(): [ 754.490270] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.490270] env[59534]: self.force_reraise() [ 754.490270] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.490270] env[59534]: raise self.value [ 754.490270] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.490270] env[59534]: updated_port = self._update_port( [ 754.490270] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.490270] env[59534]: _ensure_no_port_binding_failure(port) [ 754.490270] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.490270] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 754.491062] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. [ 754.491062] env[59534]: Removing descriptor: 21 [ 754.491062] env[59534]: ERROR nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Traceback (most recent call last): [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] yield resources [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self.driver.spawn(context, instance, image_meta, [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 
3b30863b-8dc5-43e1-a222-ddcc6945af5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 754.491062] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] vm_ref = self.build_virtual_machine(instance, [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] vif_infos = vmwarevif.get_vif_info(self._session, [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] for vif in network_info: [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return self._sync_wrapper(fn, *args, **kwargs) [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self.wait() [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self[:] = self._gt.wait() [ 754.491366] 
env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return self._exit_event.wait() [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 754.491366] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] result = hub.switch() [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return self.greenlet.switch() [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] result = function(*args, **kwargs) [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return func(*args, **kwargs) [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] raise e [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] nwinfo = self.network_api.allocate_for_instance( [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] created_port_ids = self._update_ports_for_instance( [ 754.493390] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] with excutils.save_and_reraise_exception(): [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self.force_reraise() [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] raise self.value [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] updated_port = self._update_port( [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] _ensure_no_port_binding_failure(port) [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] raise exception.PortBindingFailed(port_id=port['id']) [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. [ 754.493866] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] [ 754.494333] env[59534]: INFO nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Terminating instance [ 754.496348] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.496348] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquired lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.496348] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a 
tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.676446] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.849142] env[59534]: ERROR nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. 
[ 754.849142] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 754.849142] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.849142] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 754.849142] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.849142] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 754.849142] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.849142] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 754.849142] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.849142] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 754.849142] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.849142] env[59534]: ERROR nova.compute.manager raise self.value [ 754.849142] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.849142] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 754.849142] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.849142] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 754.849791] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.849791] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 754.849791] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. [ 754.849791] env[59534]: ERROR nova.compute.manager [ 754.849791] env[59534]: Traceback (most recent call last): [ 754.849791] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 754.849791] env[59534]: listener.cb(fileno) [ 754.849791] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 754.849791] env[59534]: result = function(*args, **kwargs) [ 754.849791] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 754.849791] env[59534]: return func(*args, **kwargs) [ 754.849791] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 754.849791] env[59534]: raise e [ 754.849791] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.849791] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 754.849791] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.849791] env[59534]: created_port_ids = self._update_ports_for_instance( [ 754.849791] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.849791] env[59534]: with excutils.save_and_reraise_exception(): [ 754.849791] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.849791] env[59534]: self.force_reraise() [ 754.849791] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.849791] env[59534]: raise self.value [ 754.849791] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.849791] env[59534]: updated_port = self._update_port( [ 754.849791] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.849791] env[59534]: _ensure_no_port_binding_failure(port) [ 754.849791] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.849791] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 754.850506] env[59534]: nova.exception.PortBindingFailed: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. [ 754.850506] env[59534]: Removing descriptor: 18 [ 754.854167] env[59534]: ERROR nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Traceback (most recent call last): [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] yield resources [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self.driver.spawn(context, instance, image_meta, [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 754.854167] env[59534]: ERROR nova.compute.manager 
[instance: 441810a5-3977-4c39-9c4f-3157678196a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] vm_ref = self.build_virtual_machine(instance, [ 754.854167] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] vif_infos = vmwarevif.get_vif_info(self._session, [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] for vif in network_info: [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return self._sync_wrapper(fn, *args, **kwargs) [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self.wait() [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self[:] = self._gt.wait() [ 754.854485] 
env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return self._exit_event.wait() [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] result = hub.switch() [ 754.854485] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return self.greenlet.switch() [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] result = function(*args, **kwargs) [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return func(*args, **kwargs) [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] raise e [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] nwinfo = self.network_api.allocate_for_instance( [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] created_port_ids = self._update_ports_for_instance( [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.854946] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] with excutils.save_and_reraise_exception(): [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self.force_reraise() [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] raise self.value [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] updated_port = self._update_port( [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] _ensure_no_port_binding_failure(port) [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] raise exception.PortBindingFailed(port_id=port['id']) [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] nova.exception.PortBindingFailed: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. [ 754.855282] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] [ 754.855610] env[59534]: INFO nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Terminating instance [ 754.855610] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquiring lock "refresh_cache-441810a5-3977-4c39-9c4f-3157678196a2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.855610] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquired lock "refresh_cache-441810a5-3977-4c39-9c4f-3157678196a2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.855610] env[59534]: DEBUG nova.network.neutron [None 
req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.972759] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.558669] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.571846] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Releasing lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.572263] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 755.572454] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 755.572977] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c4fc4e2e-67cd-42ad-be02-025f4eb6cf6b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.587258] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17659cf0-38de-466f-a172-595d635c93a4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.615842] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3b30863b-8dc5-43e1-a222-ddcc6945af5f could not be found. [ 755.616096] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 755.616274] env[59534]: INFO nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 755.616512] env[59534]: DEBUG oslo.service.loopingcall [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 755.616760] env[59534]: DEBUG nova.compute.manager [-] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.616935] env[59534]: DEBUG nova.network.neutron [-] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.797660] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.810349] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Releasing lock "refresh_cache-441810a5-3977-4c39-9c4f-3157678196a2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.810450] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 755.810587] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 755.811533] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8d3b0893-4d68-4170-ac7f-99f33b00644d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.825207] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efc01fdb-4a4c-4975-b8bf-428d75c5989b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.855607] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 441810a5-3977-4c39-9c4f-3157678196a2 could not be found. [ 755.855915] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 755.856290] env[59534]: INFO nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 755.856444] env[59534]: DEBUG oslo.service.loopingcall [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 755.857295] env[59534]: DEBUG nova.compute.manager [-] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.857496] env[59534]: DEBUG nova.network.neutron [-] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.936925] env[59534]: DEBUG nova.network.neutron [-] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.948169] env[59534]: DEBUG nova.network.neutron [-] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.965110] env[59534]: INFO nova.compute.manager [-] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Took 0.11 seconds to deallocate network for instance. 
[ 755.966752] env[59534]: DEBUG nova.compute.claims [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 755.966950] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.967652] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.983401] env[59534]: DEBUG nova.network.neutron [-] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.056320] env[59534]: ERROR nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. 
[ 756.056320] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 756.056320] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 756.056320] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 756.056320] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.056320] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 756.056320] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.056320] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 756.056320] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.056320] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 756.056320] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.056320] env[59534]: ERROR nova.compute.manager raise self.value [ 756.056320] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.056320] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 756.056320] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.056320] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 756.056854] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.056854] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 756.056854] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. [ 756.056854] env[59534]: ERROR nova.compute.manager [ 756.056854] env[59534]: Traceback (most recent call last): [ 756.056854] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 756.056854] env[59534]: listener.cb(fileno) [ 756.056854] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 756.056854] env[59534]: result = function(*args, **kwargs) [ 756.056854] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 756.056854] env[59534]: return func(*args, **kwargs) [ 756.056854] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 756.056854] env[59534]: raise e [ 756.056854] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 756.056854] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 756.056854] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.056854] env[59534]: created_port_ids = self._update_ports_for_instance( [ 756.056854] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.056854] env[59534]: with excutils.save_and_reraise_exception(): [ 756.056854] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.056854] env[59534]: self.force_reraise() [ 756.056854] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.056854] env[59534]: raise self.value [ 756.056854] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.056854] env[59534]: updated_port = self._update_port( [ 756.056854] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.056854] env[59534]: _ensure_no_port_binding_failure(port) [ 756.056854] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.056854] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 756.057691] env[59534]: nova.exception.PortBindingFailed: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. [ 756.057691] env[59534]: Removing descriptor: 17 [ 756.057691] env[59534]: ERROR nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Traceback (most recent call last): [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] yield resources [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self.driver.spawn(context, instance, image_meta, [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 756.057691] env[59534]: ERROR nova.compute.manager 
[instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 756.057691] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] vm_ref = self.build_virtual_machine(instance, [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] vif_infos = vmwarevif.get_vif_info(self._session, [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] for vif in network_info: [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return self._sync_wrapper(fn, *args, **kwargs) [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self.wait() [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self[:] = self._gt.wait() [ 756.058075] 
env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return self._exit_event.wait() [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 756.058075] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] result = hub.switch() [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return self.greenlet.switch() [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] result = function(*args, **kwargs) [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return func(*args, **kwargs) [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] raise e [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] nwinfo = self.network_api.allocate_for_instance( [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] created_port_ids = self._update_ports_for_instance( [ 756.058465] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] with excutils.save_and_reraise_exception(): [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self.force_reraise() [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] raise self.value [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] updated_port = self._update_port( [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] _ensure_no_port_binding_failure(port) [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] raise exception.PortBindingFailed(port_id=port['id']) [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] nova.exception.PortBindingFailed: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. [ 756.058785] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] [ 756.059170] env[59534]: INFO nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Terminating instance [ 756.059589] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-eaeb9be3-b904-4e62-8ed3-301829c6a2ab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.059659] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-eaeb9be3-b904-4e62-8ed3-301829c6a2ab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.059832] env[59534]: DEBUG nova.network.neutron [None 
req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 756.128407] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76f7a722-ef17-422f-8eac-3ee0de686bfa {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.138805] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a623932a-9418-48f6-8f0d-145ca812b65d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.174279] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.176523] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29099941-54d3-4a4f-8be4-8e678e3b1433 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.189747] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da12645e-c959-4726-9606-0ffa89003ca9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.203597] env[59534]: DEBUG nova.compute.provider_tree [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.215798] env[59534]: DEBUG nova.scheduler.client.report [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.232553] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.265s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.233315] env[59534]: ERROR nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Traceback (most recent call last): [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self.driver.spawn(context, instance, image_meta, [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] vm_ref = self.build_virtual_machine(instance, [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 756.233315] env[59534]: ERROR nova.compute.manager 
[instance: 441810a5-3977-4c39-9c4f-3157678196a2] vif_infos = vmwarevif.get_vif_info(self._session, [ 756.233315] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] for vif in network_info: [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return self._sync_wrapper(fn, *args, **kwargs) [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self.wait() [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self[:] = self._gt.wait() [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return self._exit_event.wait() [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] result = hub.switch() [ 756.234043] env[59534]: ERROR nova.compute.manager 
[instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return self.greenlet.switch() [ 756.234043] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] result = function(*args, **kwargs) [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] return func(*args, **kwargs) [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] raise e [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] nwinfo = self.network_api.allocate_for_instance( [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] created_port_ids = self._update_ports_for_instance( [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] with excutils.save_and_reraise_exception(): [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.235108] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] self.force_reraise() [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] raise self.value [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] updated_port = self._update_port( [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] _ensure_no_port_binding_failure(port) [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] raise exception.PortBindingFailed(port_id=port['id']) [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] nova.exception.PortBindingFailed: 
Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. [ 756.235489] env[59534]: ERROR nova.compute.manager [instance: 441810a5-3977-4c39-9c4f-3157678196a2] [ 756.235489] env[59534]: DEBUG nova.compute.utils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 756.235850] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Build of instance 441810a5-3977-4c39-9c4f-3157678196a2 was re-scheduled: Binding failed for port e766518a-f477-4b12-be50-d6060b46af36, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 756.236062] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 756.236209] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquiring lock "refresh_cache-441810a5-3977-4c39-9c4f-3157678196a2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.236354] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Acquired lock "refresh_cache-441810a5-3977-4c39-9c4f-3157678196a2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.236510] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 756.505136] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.057118] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.071325] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-eaeb9be3-b904-4e62-8ed3-301829c6a2ab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 757.075027] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 757.075027] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 757.075027] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-38bb451e-f591-45f6-9b01-e8a79a195145 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.086146] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ce56a7-6829-4cda-8f1b-f4d2aae2bf4a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.114922] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance eaeb9be3-b904-4e62-8ed3-301829c6a2ab could not be found. [ 757.116863] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 757.116863] env[59534]: INFO nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 757.116863] env[59534]: DEBUG oslo.service.loopingcall [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 757.116863] env[59534]: DEBUG nova.compute.manager [-] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 757.116863] env[59534]: DEBUG nova.network.neutron [-] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.143965] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.154348] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Releasing lock "refresh_cache-441810a5-3977-4c39-9c4f-3157678196a2" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 757.154348] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 757.154688] env[59534]: DEBUG nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 757.154893] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.204755] env[59534]: DEBUG nova.network.neutron [-] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.212907] env[59534]: DEBUG nova.network.neutron [-] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.225021] env[59534]: INFO nova.compute.manager [-] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Took 0.11 seconds to deallocate network for instance. 
[ 757.227197] env[59534]: DEBUG nova.compute.claims [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 757.227358] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.227561] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 757.260258] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.270672] env[59534]: DEBUG nova.network.neutron [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.279888] env[59534]: INFO nova.compute.manager [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] [instance: 441810a5-3977-4c39-9c4f-3157678196a2] Took 0.12 seconds to deallocate network for instance. [ 757.388545] env[59534]: INFO nova.scheduler.client.report [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Deleted allocations for instance 441810a5-3977-4c39-9c4f-3157678196a2 [ 757.399252] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc84f85-6a7a-46e4-ad87-d7aeb005fee5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.409731] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b9dc59a-39ac-45ed-931a-55ebe1c111de {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.444668] env[59534]: DEBUG oslo_concurrency.lockutils [None req-a44ad4a9-9c1e-4381-bac3-dfd510db967e tempest-ImagesOneServerTestJSON-572550516 tempest-ImagesOneServerTestJSON-572550516-project-member] Lock "441810a5-3977-4c39-9c4f-3157678196a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.519s {{(pid=59534) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 757.445373] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41cdd358-a755-437e-a795-8b8cff95aa3b {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.454124] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25811543-214e-48b0-bd2f-fb0abd9141f4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.469016] env[59534]: DEBUG nova.compute.provider_tree [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 757.479047] env[59534]: DEBUG nova.scheduler.client.report [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 757.495266] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.268s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 757.495956] env[59534]: ERROR nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Traceback (most recent call last): [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self.driver.spawn(context, instance, image_meta, [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] vm_ref = self.build_virtual_machine(instance, [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 757.495956] env[59534]: ERROR nova.compute.manager 
[instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] vif_infos = vmwarevif.get_vif_info(self._session, [ 757.495956] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] for vif in network_info: [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return self._sync_wrapper(fn, *args, **kwargs) [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self.wait() [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self[:] = self._gt.wait() [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return self._exit_event.wait() [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] result = hub.switch() [ 757.496373] env[59534]: ERROR nova.compute.manager 
[instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return self.greenlet.switch() [ 757.496373] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] result = function(*args, **kwargs) [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] return func(*args, **kwargs) [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] raise e [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] nwinfo = self.network_api.allocate_for_instance( [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] created_port_ids = self._update_ports_for_instance( [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] with excutils.save_and_reraise_exception(): [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 757.496753] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] self.force_reraise() [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] raise self.value [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] updated_port = self._update_port( [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] _ensure_no_port_binding_failure(port) [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] raise exception.PortBindingFailed(port_id=port['id']) [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] nova.exception.PortBindingFailed: 
Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. [ 757.497146] env[59534]: ERROR nova.compute.manager [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] [ 757.497146] env[59534]: DEBUG nova.compute.utils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 757.498908] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Build of instance eaeb9be3-b904-4e62-8ed3-301829c6a2ab was re-scheduled: Binding failed for port df189034-7960-4099-b3ee-4cea8604709a, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 757.499446] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 757.500026] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquiring lock "refresh_cache-eaeb9be3-b904-4e62-8ed3-301829c6a2ab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.500026] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Acquired lock "refresh_cache-eaeb9be3-b904-4e62-8ed3-301829c6a2ab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 757.500026] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 757.576948] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.809279] env[59534]: DEBUG nova.network.neutron [-] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.826934] env[59534]: INFO nova.compute.manager [-] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Took 2.21 seconds to deallocate network for instance. [ 757.829216] env[59534]: DEBUG nova.compute.claims [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 757.832356] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.832356] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 757.972061] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4262d3a-5898-4f8e-a20e-7b5f0316c256 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.983707] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13563619-f5f1-4968-b836-74c2c0e51a3d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.020235] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83dec407-fe2a-4803-9889-898146e54eeb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.031249] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d86d94c7-91e6-46b9-8d48-865f953a45b6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.048902] env[59534]: DEBUG nova.compute.provider_tree [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 758.057827] env[59534]: DEBUG nova.scheduler.client.report [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 758.073566] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 
tempest-ServersTestMultiNic-1670011942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.244s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.074223] env[59534]: ERROR nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Traceback (most recent call last): [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self.driver.spawn(context, instance, image_meta, [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] vm_ref = self.build_virtual_machine(instance, [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, 
in build_virtual_machine [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] vif_infos = vmwarevif.get_vif_info(self._session, [ 758.074223] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] for vif in network_info: [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return self._sync_wrapper(fn, *args, **kwargs) [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self.wait() [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self[:] = self._gt.wait() [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return self._exit_event.wait() [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] 
result = hub.switch() [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return self.greenlet.switch() [ 758.074745] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] result = function(*args, **kwargs) [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] return func(*args, **kwargs) [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] raise e [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] nwinfo = self.network_api.allocate_for_instance( [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] created_port_ids = self._update_ports_for_instance( [ 758.075361] env[59534]: ERROR 
nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] with excutils.save_and_reraise_exception(): [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 758.075361] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] self.force_reraise() [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] raise self.value [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] updated_port = self._update_port( [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] _ensure_no_port_binding_failure(port) [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] raise exception.PortBindingFailed(port_id=port['id']) [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 
3b30863b-8dc5-43e1-a222-ddcc6945af5f] nova.exception.PortBindingFailed: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. [ 758.075941] env[59534]: ERROR nova.compute.manager [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] [ 758.075941] env[59534]: DEBUG nova.compute.utils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 758.076666] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Build of instance 3b30863b-8dc5-43e1-a222-ddcc6945af5f was re-scheduled: Binding failed for port 210637bf-150e-4fa7-867a-baba46769887, please check neutron logs for more information. 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 758.077124] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 758.078255] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquiring lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 758.078255] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Acquired lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 758.078255] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 758.151088] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.177993] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.190159] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Releasing lock "refresh_cache-eaeb9be3-b904-4e62-8ed3-301829c6a2ab" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 758.190464] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 758.190652] env[59534]: DEBUG nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 758.190916] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 758.288371] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.296551] env[59534]: DEBUG nova.network.neutron [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.306354] env[59534]: INFO nova.compute.manager [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] [instance: eaeb9be3-b904-4e62-8ed3-301829c6a2ab] Took 0.12 seconds to deallocate network for instance. 
[ 758.400241] env[59534]: INFO nova.scheduler.client.report [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Deleted allocations for instance eaeb9be3-b904-4e62-8ed3-301829c6a2ab [ 758.421656] env[59534]: DEBUG oslo_concurrency.lockutils [None req-512921ac-f802-4d63-8c8d-db93d3e05776 tempest-MultipleCreateTestJSON-1364053262 tempest-MultipleCreateTestJSON-1364053262-project-member] Lock "eaeb9be3-b904-4e62-8ed3-301829c6a2ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.474s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.993802] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.008484] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Releasing lock "refresh_cache-3b30863b-8dc5-43e1-a222-ddcc6945af5f" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 759.008484] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 759.008704] env[59534]: DEBUG nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 759.008969] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 759.109295] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.124981] env[59534]: DEBUG nova.network.neutron [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.137927] env[59534]: INFO nova.compute.manager [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] [instance: 3b30863b-8dc5-43e1-a222-ddcc6945af5f] Took 0.13 seconds to deallocate network for instance. 
[ 759.256175] env[59534]: INFO nova.scheduler.client.report [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Deleted allocations for instance 3b30863b-8dc5-43e1-a222-ddcc6945af5f [ 759.287161] env[59534]: DEBUG oslo_concurrency.lockutils [None req-0be3ff61-e611-440f-b4e3-a36f7ef9627a tempest-ServersTestMultiNic-1670011942 tempest-ServersTestMultiNic-1670011942-project-member] Lock "3b30863b-8dc5-43e1-a222-ddcc6945af5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.384s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 759.576075] env[59534]: ERROR nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information. 
[ 759.576075] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 759.576075] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.576075] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 759.576075] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.576075] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 759.576075] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.576075] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 759.576075] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.576075] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 759.576075] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.576075] env[59534]: ERROR nova.compute.manager raise self.value [ 759.576075] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.576075] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 759.576075] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.576075] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 759.577047] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.577047] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 759.577047] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information. [ 759.577047] env[59534]: ERROR nova.compute.manager [ 759.577047] env[59534]: Traceback (most recent call last): [ 759.577047] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 759.577047] env[59534]: listener.cb(fileno) [ 759.577047] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.577047] env[59534]: result = function(*args, **kwargs) [ 759.577047] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.577047] env[59534]: return func(*args, **kwargs) [ 759.577047] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.577047] env[59534]: raise e [ 759.577047] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.577047] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 759.577047] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.577047] env[59534]: created_port_ids = self._update_ports_for_instance( [ 759.577047] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.577047] env[59534]: with excutils.save_and_reraise_exception(): [ 759.577047] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.577047] env[59534]: self.force_reraise() [ 759.577047] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.577047] env[59534]: raise self.value [ 759.577047] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.577047] env[59534]: updated_port = self._update_port( [ 759.577047] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.577047] env[59534]: _ensure_no_port_binding_failure(port) [ 759.577047] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.577047] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 759.577774] env[59534]: nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information. [ 759.577774] env[59534]: Removing descriptor: 20 [ 759.577774] env[59534]: ERROR nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information. [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Traceback (most recent call last): [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] yield resources [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self.driver.spawn(context, instance, image_meta, [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 759.577774] env[59534]: ERROR nova.compute.manager 
[instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 759.577774] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] vm_ref = self.build_virtual_machine(instance, [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] vif_infos = vmwarevif.get_vif_info(self._session, [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] for vif in network_info: [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return self._sync_wrapper(fn, *args, **kwargs) [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self.wait() [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self[:] = self._gt.wait() [ 759.578140] 
env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return self._exit_event.wait() [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 759.578140] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] result = hub.switch() [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return self.greenlet.switch() [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] result = function(*args, **kwargs) [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return func(*args, **kwargs) [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] raise e [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] nwinfo = self.network_api.allocate_for_instance( [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] created_port_ids = self._update_ports_for_instance( [ 759.578487] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] with excutils.save_and_reraise_exception(): [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self.force_reraise() [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] raise self.value [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] updated_port = self._update_port( [ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] _ensure_no_port_binding_failure(port)
[ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] raise exception.PortBindingFailed(port_id=port['id'])
[ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information.
[ 759.578822] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf]
[ 759.579208] env[59534]: INFO nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Terminating instance
[ 759.582057] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 759.582196] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquired lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 759.582420] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 759.671020] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 760.433387] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 760.446835] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Releasing lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 760.448759] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 760.448759] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 760.449708] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7e64d7ee-3789-45c8-96a1-f4d147ed79f6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.460932] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae79ec07-5e81-4cbf-a586-51d284a8f435 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.501725] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 38811b76-3497-44ff-8569-fb1e5c3952bf could not be found.
[ 760.501951] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 760.503906] env[59534]: INFO nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 760.503906] env[59534]: DEBUG oslo.service.loopingcall [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 760.504732] env[59534]: DEBUG nova.compute.manager [-] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 760.504732] env[59534]: DEBUG nova.network.neutron [-] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 760.511192] env[59534]: DEBUG nova.compute.manager [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Received event network-changed-c2cc2e50-9814-4af3-8ecb-337708efc33f {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 760.511382] env[59534]: DEBUG nova.compute.manager [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Refreshing instance network info cache due to event network-changed-c2cc2e50-9814-4af3-8ecb-337708efc33f. {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 760.511591] env[59534]: DEBUG oslo_concurrency.lockutils [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] Acquiring lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 760.511730] env[59534]: DEBUG oslo_concurrency.lockutils [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] Acquired lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 760.511922] env[59534]: DEBUG nova.network.neutron [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Refreshing network info cache for port c2cc2e50-9814-4af3-8ecb-337708efc33f {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 760.571930] env[59534]: DEBUG nova.network.neutron [-] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 760.581778] env[59534]: DEBUG nova.network.neutron [-] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 760.590446] env[59534]: DEBUG nova.network.neutron [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 760.593791] env[59534]: INFO nova.compute.manager [-] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Took 0.09 seconds to deallocate network for instance.
[ 760.597644] env[59534]: DEBUG nova.compute.claims [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 760.597811] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 760.598029] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 760.726073] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7250677c-c27c-4a05-a91e-9f198f9b5dde {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.736200] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a6d1d59-8e2f-4b06-80c3-bc95d0ae5741 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.774756] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-721f6e3a-bf2d-4de3-a53e-47f0f598e1d4 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.783377] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42bb79bd-6f26-4ae3-b264-d6d3fc1a126a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.804606] env[59534]: DEBUG nova.compute.provider_tree [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 760.814847] env[59534]: DEBUG nova.scheduler.client.report [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 760.835481] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.237s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 760.836231] env[59534]: ERROR nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information.
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Traceback (most recent call last):
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self.driver.spawn(context, instance, image_meta,
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] vm_ref = self.build_virtual_machine(instance,
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] vif_infos = vmwarevif.get_vif_info(self._session,
[ 760.836231] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] for vif in network_info:
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return self._sync_wrapper(fn, *args, **kwargs)
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self.wait()
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self[:] = self._gt.wait()
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return self._exit_event.wait()
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] result = hub.switch()
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return self.greenlet.switch()
[ 760.836588] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] result = function(*args, **kwargs)
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] return func(*args, **kwargs)
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] raise e
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] nwinfo = self.network_api.allocate_for_instance(
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] created_port_ids = self._update_ports_for_instance(
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] with excutils.save_and_reraise_exception():
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 760.837064] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] self.force_reraise()
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] raise self.value
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] updated_port = self._update_port(
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] _ensure_no_port_binding_failure(port)
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] raise exception.PortBindingFailed(port_id=port['id'])
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] nova.exception.PortBindingFailed: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information.
[ 760.837450] env[59534]: ERROR nova.compute.manager [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf]
[ 760.837450] env[59534]: DEBUG nova.compute.utils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 760.838662] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Build of instance 38811b76-3497-44ff-8569-fb1e5c3952bf was re-scheduled: Binding failed for port c2cc2e50-9814-4af3-8ecb-337708efc33f, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 760.839085] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 760.839278] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquiring lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 761.391307] env[59534]: DEBUG nova.network.neutron [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.403211] env[59534]: DEBUG oslo_concurrency.lockutils [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] Releasing lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 761.403411] env[59534]: DEBUG nova.compute.manager [req-0357b07a-0ae0-4c85-ba1e-bec0a7e47fb7 req-547d0eb6-255c-4902-a714-0e2fe2a7f813 service nova] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Received event network-vif-deleted-c2cc2e50-9814-4af3-8ecb-337708efc33f {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 761.403878] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Acquired lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 761.403878] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 761.514351] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.972719] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.986978] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Releasing lock "refresh_cache-38811b76-3497-44ff-8569-fb1e5c3952bf" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 761.988594] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 761.988594] env[59534]: DEBUG nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 761.988594] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 762.043802] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 762.060254] env[59534]: DEBUG nova.network.neutron [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.075368] env[59534]: INFO nova.compute.manager [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] [instance: 38811b76-3497-44ff-8569-fb1e5c3952bf] Took 0.09 seconds to deallocate network for instance.
[ 762.191954] env[59534]: INFO nova.scheduler.client.report [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Deleted allocations for instance 38811b76-3497-44ff-8569-fb1e5c3952bf
[ 762.220913] env[59534]: DEBUG oslo_concurrency.lockutils [None req-5ef7512a-93a3-4d28-ad05-ae1cb13d7e8d tempest-ServerDiskConfigTestJSON-1220714776 tempest-ServerDiskConfigTestJSON-1220714776-project-member] Lock "38811b76-3497-44ff-8569-fb1e5c3952bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.141s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 787.248146] env[59534]: WARNING oslo_vmware.rw_handles [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles response.begin()
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 787.248146] env[59534]: ERROR oslo_vmware.rw_handles
[ 787.248843] env[59534]: DEBUG nova.virt.vmwareapi.images [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Downloaded image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk on the data store datastore1 {{(pid=59534) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 787.250454] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Caching image {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 787.250699] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Copying Virtual Disk [datastore1] vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk to [datastore1] vmware_temp/83f48531-31c8-42bf-8615-4b9db46169d3/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk {{(pid=59534) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 787.250984] env[59534]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c27ffa70-b435-468c-8e0b-7b82e2c56ff0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 787.260710] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){
[ 787.260710] env[59534]: value = "task-1308576"
[ 787.260710] env[59534]: _type = "Task"
[ 787.260710] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 787.269260] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': task-1308576, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 787.774773] env[59534]: DEBUG oslo_vmware.exceptions [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Fault InvalidArgument not matched. {{(pid=59534) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 787.775514] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 787.776184] env[59534]: ERROR nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 787.776184] env[59534]: Faults: ['InvalidArgument']
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Traceback (most recent call last):
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] yield resources
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self.driver.spawn(context, instance, image_meta,
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self._fetch_image_if_missing(context, vi)
[ 787.776184] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] image_cache(vi, tmp_image_ds_loc)
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] vm_util.copy_virtual_disk(
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] session._wait_for_task(vmdk_copy_task)
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] return self.wait_for_task(task_ref)
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] return evt.wait()
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] result = hub.switch()
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 787.776678] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] return self.greenlet.switch()
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self.f(*self.args, **self.kw)
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] raise exceptions.translate_fault(task_info.error)
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Faults: ['InvalidArgument']
[ 787.777149] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7]
[ 787.777149] env[59534]: INFO nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Terminating instance
[ 787.778286] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 787.778488] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 787.778721] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8fb4ce8a-0bf7-4615-93b8-b34aa568ed78 {{(pid=59534) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.781024] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "refresh_cache-5a549ffd-3cc3-4723-bfe6-510dbef0fea7" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 787.781188] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "refresh_cache-5a549ffd-3cc3-4723-bfe6-510dbef0fea7" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 787.781360] env[59534]: DEBUG nova.network.neutron [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 787.790688] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 787.790864] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59534) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 787.793990] env[59534]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-818bb3ed-bbe3-4024-a9c2-7a8deb4b2cc7 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.800859] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){ [ 787.800859] env[59534]: value = "session[529ab406-b1c9-3c06-ce96-9015eeabf2c3]52780db2-ff4a-8a21-950f-a4e729a8420f" [ 787.800859] env[59534]: _type = "Task" [ 787.800859] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 787.814632] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Preparing fetch location {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 787.815463] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating directory with path [datastore1] vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 787.815463] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9567eb1a-9c6a-4f57-98df-26b3f4e0bb58 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.841425] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 
tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Created directory with path [datastore1] vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21 {{(pid=59534) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 787.842671] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Fetch image to [datastore1] vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 787.842671] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Downloading image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to [datastore1] vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk on the data store datastore1 {{(pid=59534) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 787.843671] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1f7b77-be89-48f0-85cb-efa7faf91887 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.851559] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66645e04-38c0-49ec-82d3-4865d70e45ab {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.866248] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1cd24d8-bf78-4294-8f4d-4108deb157cf {{(pid=59534) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.903984] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0891577d-ee3e-42fc-a6b6-334d4ebe34fc {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.908916] env[59534]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2d761152-bb78-4597-b650-3ab7c6f201e9 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.936650] env[59534]: DEBUG nova.virt.vmwareapi.images [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Downloading image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to the data store datastore1 {{(pid=59534) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 788.003207] env[59534]: DEBUG oslo_vmware.rw_handles [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59534) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 788.057691] env[59534]: DEBUG nova.network.neutron [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 788.062179] env[59534]: DEBUG oslo_vmware.rw_handles [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Completed reading data from the image iterator. {{(pid=59534) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 788.062179] env[59534]: DEBUG oslo_vmware.rw_handles [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59534) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 788.211713] env[59534]: DEBUG nova.network.neutron [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 788.222800] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "refresh_cache-5a549ffd-3cc3-4723-bfe6-510dbef0fea7" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 788.223902] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 788.223902] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 788.224526] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b83ec89c-fd53-460d-b62b-9b628f6fc8ee {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.240209] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Unregistering the VM {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 788.240435] env[59534]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-70e5f43b-ae43-4990-9683-48f33a4e9651 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.279789] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Unregistered the VM {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 788.280137] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Deleting contents of the VM from datastore datastore1 {{(pid=59534) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 788.280200] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Deleting the datastore file [datastore1] 5a549ffd-3cc3-4723-bfe6-510dbef0fea7 {{(pid=59534) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 788.284023] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc5248d6-1c54-4094-8fc6-06e45953c93f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.288696] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){ [ 788.288696] env[59534]: value = "task-1308578" [ 788.288696] env[59534]: _type = "Task" [ 788.288696] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 788.303283] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': task-1308578, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 788.799846] env[59534]: DEBUG oslo_vmware.api [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': task-1308578, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.040402} completed successfully. 
{{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 788.800267] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Deleted the datastore file {{(pid=59534) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 788.800712] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Deleted contents of the VM from datastore datastore1 {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 788.800849] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 788.801171] env[59534]: INFO nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Took 0.58 seconds to destroy the instance on the hypervisor. [ 788.801489] env[59534]: DEBUG oslo.service.loopingcall [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 788.802033] env[59534]: DEBUG nova.compute.manager [-] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Skipping network deallocation for instance since networking was not requested. {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 788.804410] env[59534]: DEBUG nova.compute.claims [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 788.804675] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.804928] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.882887] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae7ca11-adfe-4f85-bf9a-01f0a5ffe115 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.891997] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54e9f406-d6ee-40b8-b971-3a0824ddccc4 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.932979] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f13770a4-8dd4-4bc5-9810-fe676a27acb6 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.942750] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21373fa8-b3de-4494-b045-fef9ba41a8fb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.960525] env[59534]: DEBUG nova.compute.provider_tree [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 788.970862] env[59534]: DEBUG nova.scheduler.client.report [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 788.985092] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.180s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.985624] env[59534]: ERROR nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 788.985624] env[59534]: Faults: ['InvalidArgument'] [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Traceback (most recent call last): [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self.driver.spawn(context, instance, image_meta, [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self._fetch_image_if_missing(context, vi) [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 
5a549ffd-3cc3-4723-bfe6-510dbef0fea7] image_cache(vi, tmp_image_ds_loc) [ 788.985624] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] vm_util.copy_virtual_disk( [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] session._wait_for_task(vmdk_copy_task) [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] return self.wait_for_task(task_ref) [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] return evt.wait() [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] result = hub.switch() [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] return self.greenlet.switch() [ 
788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 788.986303] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] self.f(*self.args, **self.kw) [ 788.986864] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 788.986864] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] raise exceptions.translate_fault(task_info.error) [ 788.986864] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 788.986864] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Faults: ['InvalidArgument'] [ 788.986864] env[59534]: ERROR nova.compute.manager [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] [ 788.986864] env[59534]: DEBUG nova.compute.utils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] VimFaultException {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 788.991428] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Build of instance 5a549ffd-3cc3-4723-bfe6-510dbef0fea7 was re-scheduled: A specified parameter was not correct: fileType [ 788.991428] env[59534]: Faults: ['InvalidArgument'] {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 788.991812] env[59534]: DEBUG nova.compute.manager 
[None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 788.992036] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "refresh_cache-5a549ffd-3cc3-4723-bfe6-510dbef0fea7" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 788.992185] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "refresh_cache-5a549ffd-3cc3-4723-bfe6-510dbef0fea7" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 788.992850] env[59534]: DEBUG nova.network.neutron [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 789.029684] env[59534]: DEBUG nova.network.neutron [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 789.133493] env[59534]: DEBUG nova.network.neutron [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 789.143720] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "refresh_cache-5a549ffd-3cc3-4723-bfe6-510dbef0fea7" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 789.144044] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 789.144300] env[59534]: DEBUG nova.compute.manager [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 5a549ffd-3cc3-4723-bfe6-510dbef0fea7] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 789.238836] env[59534]: INFO nova.scheduler.client.report [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Deleted allocations for instance 5a549ffd-3cc3-4723-bfe6-510dbef0fea7 [ 789.264847] env[59534]: DEBUG oslo_concurrency.lockutils [None req-ca7f4b7c-9942-42e1-b38d-261c70a40fd3 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "5a549ffd-3cc3-4723-bfe6-510dbef0fea7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 79.878s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.705960] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.687068] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.687331] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.687491] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.687845] env[59534]: DEBUG 
oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.698300] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.698513] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.698675] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 800.698828] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59534) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 800.699898] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d108585d-6196-42bc-bdbc-8f1fa5787182 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.709181] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5d61b909-8d89-4789-81aa-61c2b3b42549 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.725225] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6790d40b-2cb4-4a78-8e7f-6f58a5f98fd8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.732420] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d33e045-0bbb-4096-989b-fe42c8c08e67 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.761382] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181463MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59534) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 800.761554] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.761712] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.797081] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Instance 1d6fb105-7087-4bdf-9b1c-b194baf39a55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 
128, 'VCPU': 1}}. {{(pid=59534) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.797288] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 800.797456] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59534) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 800.823519] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3f6b65-7603-423e-bdd6-bf3b0dd7106e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.831775] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a13362d-434b-4d8c-8370-898778d4150e {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.864526] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fff1cac4-d4ba-4dbb-a743-44df2f802981 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.872715] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-697a1d4c-6f76-4e1c-a054-41cddf732877 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.886723] env[59534]: DEBUG nova.compute.provider_tree [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed in ProviderTree for 
provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 800.895373] env[59534]: DEBUG nova.scheduler.client.report [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 800.909864] env[59534]: DEBUG nova.compute.resource_tracker [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59534) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 800.910069] env[59534]: DEBUG oslo_concurrency.lockutils [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.910371] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.910768] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Starting heal instance info cache {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 801.910768] env[59534]: DEBUG 
nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Rebuilding the list of instances to heal {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 801.921134] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Skipping network cache update for instance because it is Building. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.921294] env[59534]: DEBUG nova.compute.manager [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Didn't find any instances for network info cache update. {{(pid=59534) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 802.687364] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.687576] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.687730] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.687890] env[59534]: DEBUG oslo_service.periodic_task [None req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59534) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.688043] env[59534]: DEBUG nova.compute.manager [None 
req-fa596bd4-b70f-4888-b135-f397c86a1e56 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59534) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 808.401291] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquiring lock "a6df92d0-111a-4350-9aec-97c2fbaa98d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.401586] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Lock "a6df92d0-111a-4350-9aec-97c2fbaa98d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.410187] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Starting instance... 
{{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 808.456460] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.456815] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.458415] env[59534]: INFO nova.compute.claims [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 808.535293] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-758a5abc-ca50-44f8-b83b-17a6d4540a10 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.543227] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61cb0ec2-603d-4348-aa0c-92f9bbe2eb7a {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.573213] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a9ae56-87e5-4d4d-be72-384d94333402 {{(pid=59534) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.580109] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c6c4ecc-e630-4471-a2f8-2465750a3f74 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.592795] env[59534]: DEBUG nova.compute.provider_tree [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 808.601517] env[59534]: DEBUG nova.scheduler.client.report [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 808.613931] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.614297] env[59534]: DEBUG nova.compute.manager [None 
req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Start building networks asynchronously for instance. {{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 808.644912] env[59534]: DEBUG nova.compute.utils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Using /dev/sd instead of None {{(pid=59534) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 808.646270] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Allocating IP information in the background. {{(pid=59534) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 808.646437] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] allocate_for_instance() {{(pid=59534) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 808.653908] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Start building block device mappings for instance. 
{{(pid=59534) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 808.683243] env[59534]: INFO nova.virt.block_device [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Booting with volume 56fa55c7-7619-453e-895a-d9de4721304e at /dev/sda [ 808.701593] env[59534]: DEBUG nova.policy [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '29f50a71b01141018a816af034445b3b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30c2d3d60b9a401b85542e94473667dd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59534) authorize /opt/stack/nova/nova/policy.py:203}} [ 808.727559] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1c95e835-9c84-4dff-bfe7-fb49b3861e12 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.736645] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b519d2d0-559e-4654-a162-472a4c97fd4d {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.760834] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9cfb34a1-e76c-4a46-941b-d207d3bdeedb {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.769048] env[59534]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e74ef5b-b762-47ab-a63c-7f1837c4b3c2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.791800] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc380509-4ed6-4293-a123-0aaa73526674 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.798284] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3674ea91-6188-4466-a0b3-20db6876df5c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.812871] env[59534]: DEBUG nova.virt.block_device [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Updating existing volume attachment record: a9ba1279-38da-4d4c-812c-cda1cd074166 {{(pid=59534) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 809.008745] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Successfully created port: 25f844d2-5f4b-419d-b36a-d6e761dfd77d {{(pid=59534) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 809.026449] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Start spawning the instance on the hypervisor. 
{{(pid=59534) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 809.026449] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T19:44:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 809.026449] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Flavor limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 809.026774] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Image limits 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 809.026774] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Flavor pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 809.026774] env[59534]: DEBUG nova.virt.hardware [None 
req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Image pref 0:0:0 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 809.026774] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59534) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 809.026774] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 809.026963] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 809.027070] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Got 1 possible topologies {{(pid=59534) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 809.027171] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 809.027336] env[59534]: DEBUG nova.virt.hardware [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59534) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 809.028807] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baee74d8-0c47-4fa8-81b7-2d129bcd4bf0 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.037376] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9e1331c-2749-4b33-bc9a-105c99ddd62c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.738588] env[59534]: DEBUG nova.compute.manager [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Received event network-changed-25f844d2-5f4b-419d-b36a-d6e761dfd77d {{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 809.738890] env[59534]: DEBUG nova.compute.manager [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Refreshing instance network info cache due to event network-changed-25f844d2-5f4b-419d-b36a-d6e761dfd77d. 
{{(pid=59534) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 809.739194] env[59534]: DEBUG oslo_concurrency.lockutils [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] Acquiring lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 809.739392] env[59534]: DEBUG oslo_concurrency.lockutils [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] Acquired lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 809.739604] env[59534]: DEBUG nova.network.neutron [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Refreshing network info cache for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 809.786793] env[59534]: DEBUG nova.network.neutron [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance cache missing network info. 
{{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.119655] env[59534]: DEBUG nova.network.neutron [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.128294] env[59534]: DEBUG oslo_concurrency.lockutils [req-1674f6e4-a7cc-44e2-8f91-0a7c9f048ead req-df2e2bee-e5bd-4cec-a686-1e428be1ff0f service nova] Releasing lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 810.179445] env[59534]: ERROR nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. 
[ 810.179445] env[59534]: ERROR nova.compute.manager Traceback (most recent call last): [ 810.179445] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 810.179445] env[59534]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 810.179445] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 810.179445] env[59534]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 810.179445] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 810.179445] env[59534]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 810.179445] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 810.179445] env[59534]: ERROR nova.compute.manager self.force_reraise() [ 810.179445] env[59534]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 810.179445] env[59534]: ERROR nova.compute.manager raise self.value [ 810.179445] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 810.179445] env[59534]: ERROR nova.compute.manager updated_port = self._update_port( [ 810.179445] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 810.179445] env[59534]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 810.180045] env[59534]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 810.180045] env[59534]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 810.180045] env[59534]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. [ 810.180045] env[59534]: ERROR nova.compute.manager [ 810.180045] env[59534]: Traceback (most recent call last): [ 810.180045] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 810.180045] env[59534]: listener.cb(fileno) [ 810.180045] env[59534]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 810.180045] env[59534]: result = function(*args, **kwargs) [ 810.180045] env[59534]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 810.180045] env[59534]: return func(*args, **kwargs) [ 810.180045] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 810.180045] env[59534]: raise e [ 810.180045] env[59534]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 810.180045] env[59534]: nwinfo = self.network_api.allocate_for_instance( [ 810.180045] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 810.180045] env[59534]: created_port_ids = self._update_ports_for_instance( [ 810.180045] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 810.180045] env[59534]: with excutils.save_and_reraise_exception(): [ 810.180045] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 810.180045] env[59534]: self.force_reraise() [ 810.180045] env[59534]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 810.180045] env[59534]: raise self.value [ 810.180045] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 810.180045] env[59534]: updated_port = self._update_port( [ 810.180045] 
env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 810.180045] env[59534]: _ensure_no_port_binding_failure(port) [ 810.180045] env[59534]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 810.180045] env[59534]: raise exception.PortBindingFailed(port_id=port['id']) [ 810.181174] env[59534]: nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. [ 810.181174] env[59534]: Removing descriptor: 21 [ 810.181174] env[59534]: ERROR nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Traceback (most recent call last): [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] yield resources [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self.driver.spawn(context, instance, image_meta, [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 810.181174] env[59534]: ERROR nova.compute.manager 
[instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 810.181174] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] vm_ref = self.build_virtual_machine(instance, [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] vif_infos = vmwarevif.get_vif_info(self._session, [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] for vif in network_info: [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return self._sync_wrapper(fn, *args, **kwargs) [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self.wait() [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self[:] = self._gt.wait() [ 810.181717] 
env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return self._exit_event.wait() [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 810.181717] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] result = hub.switch() [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return self.greenlet.switch() [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] result = function(*args, **kwargs) [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return func(*args, **kwargs) [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] raise e [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] nwinfo = self.network_api.allocate_for_instance( [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] created_port_ids = self._update_ports_for_instance( [ 810.182191] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] with excutils.save_and_reraise_exception(): [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self.force_reraise() [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] raise self.value [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] updated_port = self._update_port( [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] _ensure_no_port_binding_failure(port) [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] raise exception.PortBindingFailed(port_id=port['id']) [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. [ 810.182617] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] [ 810.183054] env[59534]: INFO nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Terminating instance [ 810.184750] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquiring lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.184907] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquired lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 810.185077] env[59534]: DEBUG nova.network.neutron [None 
req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 810.211267] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.310737] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.320188] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Releasing lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 810.320977] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Start destroying the instance on the hypervisor. 
{{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 810.321311] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-86ed7167-7b8f-4db4-984a-5d10b714e854 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.331478] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e5fd0f-c6f5-4938-87f6-944dad5f4cfe {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.355356] env[59534]: WARNING nova.virt.vmwareapi.driver [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance a6df92d0-111a-4350-9aec-97c2fbaa98d0 could not be found. 
[ 810.355569] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 810.355836] env[59534]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d2c0c531-2a10-4396-bfa4-5a114147a453 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.363802] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-addd78f6-5655-4abc-a93c-758a40d79021 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.385907] env[59534]: WARNING nova.virt.vmwareapi.vmops [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a6df92d0-111a-4350-9aec-97c2fbaa98d0 could not be found. [ 810.386134] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 810.386314] env[59534]: INFO nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Took 0.07 seconds to destroy the instance on the hypervisor. 
[ 810.386545] env[59534]: DEBUG oslo.service.loopingcall [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 810.386816] env[59534]: DEBUG nova.compute.manager [-] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 810.386925] env[59534]: DEBUG nova.network.neutron [-] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 810.402965] env[59534]: DEBUG nova.network.neutron [-] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.410362] env[59534]: DEBUG nova.network.neutron [-] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.418922] env[59534]: INFO nova.compute.manager [-] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Took 0.03 seconds to deallocate network for instance. [ 810.477588] env[59534]: INFO nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Took 0.06 seconds to detach 1 volumes for instance. 
[ 810.480228] env[59534]: DEBUG nova.compute.claims [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 810.480469] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.480606] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.547368] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f8cc14f-bab7-49f2-af10-6d667dafc797 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.555478] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a9248b3-b278-4a95-8d3c-8bbc901291c2 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.586096] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ca45442-945e-4698-92bd-7e35803fede5 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.593406] env[59534]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe6d680d-9f49-4d9d-937e-f52576d11714 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.606443] env[59534]: DEBUG nova.compute.provider_tree [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 810.615107] env[59534]: DEBUG nova.scheduler.client.report [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 810.628259] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.148s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 810.628941] env[59534]: ERROR nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 
tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Traceback (most recent call last): [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self.driver.spawn(context, instance, image_meta, [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] vm_ref = self.build_virtual_machine(instance, [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] vif_infos = vmwarevif.get_vif_info(self._session, [ 810.628941] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 810.629352] env[59534]: ERROR 
nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] for vif in network_info: [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return self._sync_wrapper(fn, *args, **kwargs) [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self.wait() [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self[:] = self._gt.wait() [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return self._exit_event.wait() [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] result = hub.switch() [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 810.629352] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return self.greenlet.switch() [ 810.629352] env[59534]: ERROR 
nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] result = function(*args, **kwargs) [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] return func(*args, **kwargs) [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] raise e [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] nwinfo = self.network_api.allocate_for_instance( [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] created_port_ids = self._update_ports_for_instance( [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] with excutils.save_and_reraise_exception(): [ 810.629798] env[59534]: ERROR nova.compute.manager [instance: 
a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 810.629798] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] self.force_reraise()
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] raise self.value
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] updated_port = self._update_port(
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] _ensure_no_port_binding_failure(port)
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] raise exception.PortBindingFailed(port_id=port['id'])
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] nova.exception.PortBindingFailed: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information.
[ 810.630226] env[59534]: ERROR nova.compute.manager [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0]
[ 810.630226] env[59534]: DEBUG nova.compute.utils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 810.631347] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Build of instance a6df92d0-111a-4350-9aec-97c2fbaa98d0 was re-scheduled: Binding failed for port 25f844d2-5f4b-419d-b36a-d6e761dfd77d, please check neutron logs for more information. {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 810.631789] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 810.632013] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquiring lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 810.632164] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Acquired lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 810.632319] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 810.655300] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 810.750485] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 810.761479] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Releasing lock "refresh_cache-a6df92d0-111a-4350-9aec-97c2fbaa98d0" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 810.761684] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 810.761866] env[59534]: DEBUG nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Deallocating network for instance {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 810.762037] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] deallocate_for_instance() {{(pid=59534) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 810.777280] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 810.784324] env[59534]: DEBUG nova.network.neutron [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 810.791136] env[59534]: INFO nova.compute.manager [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] [instance: a6df92d0-111a-4350-9aec-97c2fbaa98d0] Took 0.03 seconds to deallocate network for instance.
[ 810.869067] env[59534]: INFO nova.scheduler.client.report [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Deleted allocations for instance a6df92d0-111a-4350-9aec-97c2fbaa98d0
[ 810.883154] env[59534]: DEBUG oslo_concurrency.lockutils [None req-80ec4edb-8b27-47f7-a3f4-ea69798bd613 tempest-ServerActionsV293TestJSON-1617646457 tempest-ServerActionsV293TestJSON-1617646457-project-member] Lock "a6df92d0-111a-4350-9aec-97c2fbaa98d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 2.481s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 833.733040] env[59534]: WARNING oslo_vmware.rw_handles [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles response.begin()
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 833.733040] env[59534]: ERROR oslo_vmware.rw_handles
[ 833.733816] env[59534]: DEBUG nova.virt.vmwareapi.images [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Downloaded image file data ca8542a2-3ba7-4624-b2be-cd49a340ac21 to vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk on the data store datastore1 {{(pid=59534) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 833.735014] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Caching image {{(pid=59534) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 833.735269] env[59534]: DEBUG nova.virt.vmwareapi.vm_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Copying Virtual Disk [datastore1] vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/tmp-sparse.vmdk to [datastore1] vmware_temp/5fc1d6d7-bec9-4b86-ac9e-cb4d596c1265/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk {{(pid=59534) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 833.735560] env[59534]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-27cdcbd6-175c-421b-93e8-db2dbd4940b8 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 833.743622] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){
[ 833.743622] env[59534]: value = "task-1308590"
[ 833.743622] env[59534]: _type = "Task"
[ 833.743622] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 833.753062] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': task-1308590, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 834.253451] env[59534]: DEBUG oslo_vmware.exceptions [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Fault InvalidArgument not matched. {{(pid=59534) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 834.253751] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "[datastore1] devstack-image-cache_base/ca8542a2-3ba7-4624-b2be-cd49a340ac21/ca8542a2-3ba7-4624-b2be-cd49a340ac21.vmdk" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 834.254354] env[59534]: ERROR nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 834.254354] env[59534]: Faults: ['InvalidArgument']
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Traceback (most recent call last):
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] yield resources
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self.driver.spawn(context, instance, image_meta,
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self._fetch_image_if_missing(context, vi)
[ 834.254354] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] image_cache(vi, tmp_image_ds_loc)
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] vm_util.copy_virtual_disk(
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] session._wait_for_task(vmdk_copy_task)
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] return self.wait_for_task(task_ref)
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] return evt.wait()
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] result = hub.switch()
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 834.254752] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] return self.greenlet.switch()
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self.f(*self.args, **self.kw)
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] raise exceptions.translate_fault(task_info.error)
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Faults: ['InvalidArgument']
[ 834.255377] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55]
[ 834.255377] env[59534]: INFO nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Terminating instance
[ 834.257220] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "refresh_cache-1d6fb105-7087-4bdf-9b1c-b194baf39a55" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 834.257379] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "refresh_cache-1d6fb105-7087-4bdf-9b1c-b194baf39a55" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 834.257543] env[59534]: DEBUG nova.network.neutron [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 834.281662] env[59534]: DEBUG nova.network.neutron [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 834.337360] env[59534]: DEBUG nova.network.neutron [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 834.347059] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "refresh_cache-1d6fb105-7087-4bdf-9b1c-b194baf39a55" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 834.347458] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Start destroying the instance on the hypervisor. {{(pid=59534) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 834.347672] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Destroying instance {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 834.348696] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c014e058-3a6f-4838-b2e5-53860e955d3f {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 834.356869] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Unregistering the VM {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 834.357082] env[59534]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2b3c6e4c-fe60-4c30-af96-1865de4acf02 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 834.388811] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Unregistered the VM {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 834.389029] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Deleting contents of the VM from datastore datastore1 {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 834.389213] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Deleting the datastore file [datastore1] 1d6fb105-7087-4bdf-9b1c-b194baf39a55 {{(pid=59534) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 834.389446] env[59534]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-60411425-4db6-44dd-b4ef-4fbf908f6fef {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 834.395788] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for the task: (returnval){
[ 834.395788] env[59534]: value = "task-1308592"
[ 834.395788] env[59534]: _type = "Task"
[ 834.395788] env[59534]: } to complete. {{(pid=59534) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 834.402972] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': task-1308592, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 834.905505] env[59534]: DEBUG oslo_vmware.api [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Task: {'id': task-1308592, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.037568} completed successfully. {{(pid=59534) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 834.905901] env[59534]: DEBUG nova.virt.vmwareapi.ds_util [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Deleted the datastore file {{(pid=59534) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 834.905901] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Deleted contents of the VM from datastore datastore1 {{(pid=59534) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 834.906083] env[59534]: DEBUG nova.virt.vmwareapi.vmops [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Instance destroyed {{(pid=59534) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 834.906252] env[59534]: INFO nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Took 0.56 seconds to destroy the instance on the hypervisor.
[ 834.906488] env[59534]: DEBUG oslo.service.loopingcall [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59534) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 834.906678] env[59534]: DEBUG nova.compute.manager [-] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Skipping network deallocation for instance since networking was not requested. {{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 834.908801] env[59534]: DEBUG nova.compute.claims [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Aborting claim: {{(pid=59534) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 834.908961] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 834.909176] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 834.968035] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9820f62b-8318-4345-829c-b30b11f6bb31 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 834.974603] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b067953d-9eb4-44a0-a057-2938746cd1de {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.003317] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f496ee-7d77-4867-bf59-269a85347e46 {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.010230] env[59534]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-901ba81f-cc18-484a-aa43-b3b094afee5c {{(pid=59534) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.023925] env[59534]: DEBUG nova.compute.provider_tree [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed in ProviderTree for provider: 7c9b9790-f1a0-47dd-a54c-c74c172308d9 {{(pid=59534) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 835.032052] env[59534]: DEBUG nova.scheduler.client.report [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Inventory has not changed for provider 7c9b9790-f1a0-47dd-a54c-c74c172308d9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59534) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 835.044266] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.135s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 835.044759] env[59534]: ERROR nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 835.044759] env[59534]: Faults: ['InvalidArgument']
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Traceback (most recent call last):
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self.driver.spawn(context, instance, image_meta,
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self._fetch_image_if_missing(context, vi)
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] image_cache(vi, tmp_image_ds_loc)
[ 835.044759] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] vm_util.copy_virtual_disk(
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] session._wait_for_task(vmdk_copy_task)
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] return self.wait_for_task(task_ref)
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] return evt.wait()
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] result = hub.switch()
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] return self.greenlet.switch()
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 835.045208] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] self.f(*self.args, **self.kw)
[ 835.045611] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 835.045611] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] raise exceptions.translate_fault(task_info.error)
[ 835.045611] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 835.045611] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Faults: ['InvalidArgument']
[ 835.045611] env[59534]: ERROR nova.compute.manager [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55]
[ 835.045611] env[59534]: DEBUG nova.compute.utils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] VimFaultException {{(pid=59534) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 835.046821] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Build of instance 1d6fb105-7087-4bdf-9b1c-b194baf39a55 was re-scheduled: A specified parameter was not correct: fileType
[ 835.046821] env[59534]: Faults: ['InvalidArgument'] {{(pid=59534) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 835.047229] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Unplugging VIFs for instance {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 835.047447] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquiring lock "refresh_cache-1d6fb105-7087-4bdf-9b1c-b194baf39a55" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 835.047587] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Acquired lock "refresh_cache-1d6fb105-7087-4bdf-9b1c-b194baf39a55" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 835.047738] env[59534]: DEBUG nova.network.neutron [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Building network info cache for instance {{(pid=59534) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 835.069443] env[59534]: DEBUG nova.network.neutron [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Instance cache missing network info. {{(pid=59534) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 835.126725] env[59534]: DEBUG nova.network.neutron [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Updating instance_info_cache with network_info: [] {{(pid=59534) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 835.136042] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Releasing lock "refresh_cache-1d6fb105-7087-4bdf-9b1c-b194baf39a55" {{(pid=59534) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 835.136257] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59534) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 835.136433] env[59534]: DEBUG nova.compute.manager [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] [instance: 1d6fb105-7087-4bdf-9b1c-b194baf39a55] Skipping network deallocation for instance since networking was not requested.
{{(pid=59534) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 835.213579] env[59534]: INFO nova.scheduler.client.report [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Deleted allocations for instance 1d6fb105-7087-4bdf-9b1c-b194baf39a55 [ 835.227902] env[59534]: DEBUG oslo_concurrency.lockutils [None req-8401108f-42ae-487f-9b5a-e99f50239a15 tempest-ServerShowV247Test-312332432 tempest-ServerShowV247Test-312332432-project-member] Lock "1d6fb105-7087-4bdf-9b1c-b194baf39a55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 117.549s {{(pid=59534) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}