[ 543.742731] env[60121]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 544.191883] env[60164]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 545.750959] env[60164]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60164) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 545.751337] env[60164]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60164) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 545.751430] env[60164]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60164) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 545.751656] env[60164]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 545.752714] env[60164]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 545.867066] env[60164]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 545.876803] env[60164]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=60164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 545.979013] env[60164]: INFO nova.virt.driver [None req-bb0451cf-e927-484d-bf66-54d1430d7f5f None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 546.050197] env[60164]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 546.050325] env[60164]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 546.050409] env[60164]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60164) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 549.260035] env[60164]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-8904fb63-65cb-4ffe-a0f2-421807b98a17 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.275476] env[60164]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60164) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 549.275677] env[60164]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-0fc3ede1-d301-47f2-8a1e-621f8c12de77 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.310065] env[60164]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 5379e.
[ 549.310294] env[60164]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.260s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 549.310837] env[60164]: INFO nova.virt.vmwareapi.driver [None req-bb0451cf-e927-484d-bf66-54d1430d7f5f None None] VMware vCenter version: 7.0.3
[ 549.314435] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fc904b7-eb2d-445e-94cc-de9b15d50cf6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.331947] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a53c608-3f33-4cb5-ac72-763470e335c9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.337924] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02206d2d-9016-4e5d-b26d-da59a2c33770 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.344481] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f689e23-f3ce-4fbc-816a-2f45390db0f0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.357422] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac00ba6-527d-49ec-8977-b18a17df6aa4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.363464] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-253c0540-04f3-4dd0-b230-c1d2976ff580 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.394584] env[60164]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-f639f9b5-aed6-4116-8b3b-657dbc802bdc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 549.399816] env[60164]: DEBUG nova.virt.vmwareapi.driver [None req-bb0451cf-e927-484d-bf66-54d1430d7f5f None None] Extension org.openstack.compute already exists. {{(pid=60164) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 549.402724] env[60164]: INFO nova.compute.provider_config [None req-bb0451cf-e927-484d-bf66-54d1430d7f5f None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 549.421858] env[60164]: DEBUG nova.context [None req-bb0451cf-e927-484d-bf66-54d1430d7f5f None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),b5105805-148c-4545-8985-397f7b32e247(cell1) {{(pid=60164) load_cells /opt/stack/nova/nova/context.py:464}}
[ 549.423811] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 549.424039] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 549.424757] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 549.425113] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Acquiring lock "b5105805-148c-4545-8985-397f7b32e247" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 549.425299] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Lock "b5105805-148c-4545-8985-397f7b32e247" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 549.426252] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Lock "b5105805-148c-4545-8985-397f7b32e247" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 549.439474] env[60164]: DEBUG oslo_db.sqlalchemy.engines [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60164) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 549.439847] env[60164]: DEBUG oslo_db.sqlalchemy.engines [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60164) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 549.446551] env[60164]: ERROR nova.db.main.api [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 549.446551] env[60164]: result = function(*args, **kwargs)
[ 549.446551] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 549.446551] env[60164]: return func(*args, **kwargs)
[ 549.446551] env[60164]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 549.446551] env[60164]: result = fn(*args, **kwargs)
[ 549.446551] env[60164]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 549.446551] env[60164]: return f(*args, **kwargs)
[ 549.446551] env[60164]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 549.446551] env[60164]: return db.service_get_minimum_version(context, binaries)
[ 549.446551] env[60164]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 549.446551] env[60164]: _check_db_access()
[ 549.446551] env[60164]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 549.446551] env[60164]: stacktrace = ''.join(traceback.format_stack())
[ 549.446551] env[60164]:
[ 549.447561] env[60164]: ERROR nova.db.main.api [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 549.447561] env[60164]: result = function(*args, **kwargs)
[ 549.447561] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 549.447561] env[60164]: return func(*args, **kwargs)
[ 549.447561] env[60164]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 549.447561] env[60164]: result = fn(*args, **kwargs)
[ 549.447561] env[60164]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 549.447561] env[60164]: return f(*args, **kwargs)
[ 549.447561] env[60164]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 549.447561] env[60164]: return db.service_get_minimum_version(context, binaries)
[ 549.447561] env[60164]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 549.447561] env[60164]: _check_db_access()
[ 549.447561] env[60164]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 549.447561] env[60164]: stacktrace = ''.join(traceback.format_stack())
[ 549.447561] env[60164]:
[ 549.447975] env[60164]: WARNING nova.objects.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Failed to get minimum service version for cell b5105805-148c-4545-8985-397f7b32e247
[ 549.448044] env[60164]: WARNING nova.objects.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 549.448481] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Acquiring lock "singleton_lock" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 549.448638] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Acquired lock "singleton_lock" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 549.448874] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Releasing lock "singleton_lock" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 549.449204] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Full set of CONF: {{(pid=60164) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 549.449343] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ******************************************************************************** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 549.449470] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] Configuration options gathered from: {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 549.449603] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 549.449793] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 549.449917] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ================================================================================ {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 549.450131] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] allow_resize_to_same_host = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.450298] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] arq_binding_timeout = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.450427] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] backdoor_port = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.450548] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] backdoor_socket = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.450703] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] block_device_allocate_retries = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.450858] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] block_device_allocate_retries_interval = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451046] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cert = self.pem {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451193] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451355] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] 
compute_monitors = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451517] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] config_dir = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451684] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] config_drive_format = iso9660 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451819] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.451978] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] config_source = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.452152] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] console_host = devstack {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.452310] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] control_exchange = nova {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.452464] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cpu_allocation_ratio = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.452618] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] daemon = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.452780] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] debug = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.452929] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] default_access_ip_network_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.453101] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] default_availability_zone = nova {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.453252] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] default_ephemeral_format = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.453478] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.453636] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] default_schedule_zone = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.453787] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] disk_allocation_ratio = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.453939] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] enable_new_services = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.454124] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] enabled_apis = ['osapi_compute'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.454285] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] enabled_ssl_apis = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.454438] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] flat_injected = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.454592] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] force_config_drive = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.454743] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] force_raw_images = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.454902] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] graceful_shutdown_timeout = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.455064] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] heal_instance_info_cache_interval = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.455276] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] host = cpu-1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.455441] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.455635] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.455755] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.455964] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.456135] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_build_timeout = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.456290] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_delete_interval = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.456450] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_format = [instance: %(uuid)s] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.456610] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_name_template = instance-%08x {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.456764] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_usage_audit = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.456923] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_usage_audit_period = month {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.457089] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.457250] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] instances_path = /opt/stack/data/nova/instances {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.457437] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] internal_service_availability_zone = internal {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.457598] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] key = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.457754] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] live_migration_retry_count = 30 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.457909] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_config_append = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.458125] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.458347] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_dir = None {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.458531] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.458660] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_options = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.458823] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_rotate_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.458985] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_rotate_interval_type = days {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.459163] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] log_rotation_type = none {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.459290] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.459415] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.459576] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.459733] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.459857] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460020] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] long_rpc_timeout = 1800 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460176] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] max_concurrent_builds = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460327] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] max_concurrent_live_migrations = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460518] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] max_concurrent_snapshots = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460684] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] max_local_block_devices = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460837] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] max_logfile_count = 30 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.460991] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] max_logfile_size_mb = 200 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.461160] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] maximum_instance_delete_attempts = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.461321] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metadata_listen = 0.0.0.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.461484] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metadata_listen_port = 8775 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.461645] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metadata_workers = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.461798] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] migrate_max_retries = -1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.461959] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] mkisofs_cmd = genisoimage {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.462174] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.462301] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] my_ip = 10.180.1.21 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.462463] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] network_allocate_retries = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.462633] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.462794] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.462951] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] osapi_compute_listen_port = 8774 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.463121] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] osapi_compute_unique_server_name_scope = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.463285] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] osapi_compute_workers = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.463443] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] password_length = 12 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.463599] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] periodic_enable = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.463756] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] periodic_fuzzy_delay = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.463918] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] pointer_model = usbtablet {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464089] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] preallocate_images = none {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464245] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] publish_errors = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464372] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] pybasedir = /opt/stack/nova {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464526] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ram_allocation_ratio = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464681] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rate_limit_burst = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464840] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rate_limit_except_level = CRITICAL {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.464996] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rate_limit_interval = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.465163] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] reboot_timeout = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.465318] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] 
reclaim_instance_interval = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.465467] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] record = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.465627] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] reimage_timeout_per_gb = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.465787] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] report_interval = 120 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.465941] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rescue_timeout = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.466108] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] reserved_host_cpus = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.466263] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] reserved_host_disk_mb = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.466414] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] reserved_host_memory_mb = 512 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.466569] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] reserved_huge_pages = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.466719] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] resize_confirm_window = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.466873] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] resize_fs_using_block_device = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.467033] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] resume_guests_state_on_host_boot = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.467197] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.467387] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rpc_response_timeout = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.467563] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] run_external_periodic_tasks = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.467735] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] running_deleted_instance_action = reap 
{{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.467885] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.468051] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] running_deleted_instance_timeout = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.468210] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler_instance_sync_interval = 120 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.468361] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_down_time = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.468540] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] servicegroup_driver = db {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.468699] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] shelved_offload_time = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.468854] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] shelved_poll_interval = 3600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.469022] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] shutdown_timeout = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.469184] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] source_is_ipv6 = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.469339] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ssl_only = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.469579] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.469768] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] sync_power_state_interval = 600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.469935] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] sync_power_state_pool_size = 1000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.470111] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] syslog_log_facility = LOG_USER {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.470266] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] tempdir = None {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.470422] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] timeout_nbd = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.470584] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] transport_url = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.470739] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] update_resources_interval = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.470895] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_cow_images = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471058] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_eventlog = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471217] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_journal = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471369] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_json = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471525] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_rootwrap_daemon = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471678] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_stderr = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471830] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] use_syslog = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.471979] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vcpu_pin_set = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.472154] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plugging_is_fatal = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.472316] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plugging_timeout = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.472476] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] virt_mkfs = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.472632] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] volume_usage_poll_interval = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.472787] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] watch_log_file = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.472948] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] web = /usr/share/spice-html5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 549.473141] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_concurrency.disable_process_locking = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.473429] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.473609] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.473775] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.473942] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.474119] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.474281] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.474459] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.auth_strategy = keystone {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.474619] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.compute_link_prefix = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.474788] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.474952] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.dhcp_domain = novalocal {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.475126] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.enable_instance_password = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.475285] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.glance_link_prefix = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.475445] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.475612] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.475769] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.instance_list_per_project_cells = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.475925] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.list_records_by_skipping_down_cells = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.476092] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.local_metadata_per_cell = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.476257] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.max_limit = 1000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.476418] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.metadata_cache_expiration = 15 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.476594] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.neutron_default_tenant_id = default {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.476738] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.use_forwarded_for = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.476899] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.use_neutron_default_nets = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.477072] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.477233] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.477413] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.477587] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.477753] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_dynamic_targets = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.477912] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_jsonfile_path = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.478100] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.478291] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.backend = dogpile.cache.memcached {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.478474] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.backend_argument = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.478648] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.config_prefix = cache.oslo {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.478816] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.dead_timeout = 60.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.478978] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.debug_cache_backend = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.479151] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.enable_retry_client = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.479310] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.enable_socket_keepalive = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.479480] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.enabled = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.479660] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.expiration_time = 600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.479841] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.hashclient_retry_attempts = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480025] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480192] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] 
cache.memcache_dead_retry = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480360] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_password = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480523] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480681] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480839] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_pool_maxsize = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.480998] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.481171] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_sasl_enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.481347] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.481511] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.481674] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.memcache_username = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.481836] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.proxies = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.481997] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.retry_attempts = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.482173] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.retry_delay = 0.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.482334] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.socket_keepalive_count = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.482493] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.socket_keepalive_idle = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.482649] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.socket_keepalive_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.482801] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.tls_allowed_ciphers = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.482954] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.tls_cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.483119] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.tls_certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.483281] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.tls_enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.483438] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cache.tls_keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.483608] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.483776] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.auth_type = password {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.483933] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.484112] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.484271] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.484430] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.484589] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.cross_az_attach = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.484745] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.debug = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.484900] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.endpoint_template = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.485069] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 
None None] cinder.http_retries = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.485231] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.485386] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.485550] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.os_region_name = RegionOne {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.485722] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.485939] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cinder.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.486134] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.486296] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.cpu_dedicated_set = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.486458] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.cpu_shared_set = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.486624] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.image_type_exclude_list = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.486786] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.486945] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.487124] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.487292] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.487480] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
549.487650] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.resource_provider_association_refresh = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.487810] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.shutdown_retry_interval = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.487985] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.488176] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] conductor.workers = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.488374] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] console.allowed_origins = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.488545] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] console.ssl_ciphers = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.488718] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] console.ssl_minimum_version = default {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.488890] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] consoleauth.token_ttl = 600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.489076] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.489237] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.489402] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.489562] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.489747] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.489914] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.490090] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.insecure = False {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.490251] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.490409] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.490569] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.490725] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.region_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.490881] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.service_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.491058] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.service_type = accelerator {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.491224] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.491382] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.491542] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.491707] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.491876] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.492046] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] cyborg.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.492229] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.backend = sqlalchemy {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.492403] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.connection = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.492571] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.connection_debug = 0 {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.492737] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.connection_parameters = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.492902] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.connection_recycle_time = 3600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.493078] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.connection_trace = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.493242] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.db_inc_retry_interval = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.493404] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.db_max_retries = 20 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.493566] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.db_max_retry_interval = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.493725] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.db_retry_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.493889] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.max_overflow = 50 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.494056] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.max_pool_size = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.494233] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.max_retries = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.494398] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.mysql_enable_ndb = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.494604] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.494723] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.mysql_wsrep_sync_wait = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.494883] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.pool_timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.495060] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.retry_interval = 10 
{{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.495219] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.slave_connection = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.495389] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.sqlite_synchronous = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.495551] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] database.use_db_reconnect = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.495728] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.backend = sqlalchemy {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.496149] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.connection = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.496341] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.connection_debug = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.496520] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.connection_parameters = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.496689] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.connection_recycle_time = 3600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.496859] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.connection_trace = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.497033] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.db_inc_retry_interval = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.497204] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.db_max_retries = 20 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.497389] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.db_max_retry_interval = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.497566] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.db_retry_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.497738] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.max_overflow = 50 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.497903] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.max_pool_size = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.498085] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.max_retries = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.498253] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.mysql_enable_ndb = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.498448] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.498621] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.498785] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.pool_timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.498954] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.retry_interval = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.499126] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.slave_connection = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.499293] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] api_database.sqlite_synchronous = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.499469] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] devices.enabled_mdev_types = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.499668] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.499850] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ephemeral_storage_encryption.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.500030] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.500203] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.api_servers = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.500367] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.cafile = None {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.500531] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.500695] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.500855] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.501023] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.501190] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.debug = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.501356] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.default_trusted_certificate_ids = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.501521] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.enable_certificate_validation = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.501688] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.enable_rbd_download = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.501891] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.502079] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.502246] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.502406] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.502567] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.502729] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.num_retries = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.502899] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.rbd_ceph_conf = {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.503073] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.rbd_connect_timeout = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.503244] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.rbd_pool = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.503411] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.rbd_user = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.503573] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.region_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.503764] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.service_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.503935] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.service_type = image {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.504110] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.504271] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.504429] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.504590] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.504770] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.504944] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.verify_glance_signatures = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.505118] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] glance.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.505370] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] guestfs.debug = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.505576] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.config_drive_cdrom = False {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.505749] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.config_drive_inject_password = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.505920] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.506098] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.506267] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.enable_remotefx = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.506440] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.instances_path_share = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.506685] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.iscsi_initiator_list = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.506862] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.limit_cpu_features = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.507040] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.507233] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.507458] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.507609] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.507780] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.507945] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.use_multipath_io = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.508120] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.508283] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.508439] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.vswitch_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.508600] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.508769] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] mks.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.509132] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.509327] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] image_cache.manager_interval = 2400 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.509498] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] image_cache.precache_concurrency = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.509688] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] image_cache.remove_unused_base_images = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.509870] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.510078] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.510229] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] image_cache.subdirectory_name = _base {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.510409] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.api_max_retries = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.510576] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.api_retry_interval = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.510738] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.510898] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.auth_type = None {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.511070] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.511234] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.511400] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.511562] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.511720] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.511878] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.512049] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.512213] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.512373] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.512532] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.512702] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.partition_key = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.512860] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.peer_list = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.513029] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.region_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.513195] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.serial_console_state_timeout = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.513355] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.service_name = None {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.513525] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.service_type = baremetal {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.513687] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.513846] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.514008] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.514176] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.514361] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.514528] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ironic.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.514709] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.514881] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] key_manager.fixed_key = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.515074] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.515238] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.barbican_api_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.515398] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.barbican_endpoint = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.515570] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.barbican_endpoint_type = public {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.515728] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.barbican_region_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.515885] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.516050] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.516214] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.516375] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.516531] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.516692] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.number_of_retries = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.516928] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.retry_delay = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.517023] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.send_service_user_token = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.517187] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.517367] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.517545] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.verify_ssl = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.517706] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican.verify_ssl_path = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.517870] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.518043] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.auth_type = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.518207] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.518383] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.518558] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.518724] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.518882] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.519056] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.519720] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] barbican_service_user.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.519720] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.approle_role_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.519720] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.approle_secret_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.519720] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.519866] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.520047] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.520214] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.520371] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.520544] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.kv_mountpoint = secret {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.520713] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.kv_version = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.520882] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.namespace = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.521056] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.root_token_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.521224] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.521383] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.ssl_ca_crt_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.521543] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.521704] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.use_ssl = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.521871] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.522047] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.522210] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.522376] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.522538] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.522698] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.522853] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523023] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523187] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523344] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 
None None] keystone.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523503] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523667] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.region_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523823] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.service_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.523991] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.service_type = identity {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.524165] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.524325] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.524486] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.524643] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.524821] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.524981] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] keystone.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.525190] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.connection_uri = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.525355] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.cpu_mode = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.525525] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.525692] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.cpu_models = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.525862] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.526040] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.526207] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.cpu_power_management = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.526380] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.526543] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.device_detach_attempts = 8 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.526705] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.device_detach_timeout = 20 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.526870] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.disk_cachemodes = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.527037] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.disk_prefix = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.527205] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.enabled_perf_events = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.527392] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.file_backed_memory = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.527571] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.gid_maps = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.527730] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.hw_disk_discard = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.527889] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.hw_machine_type = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.528070] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_rbd_ceph_conf = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.528238] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.528427] env[60164]: DEBUG 
oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.528601] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_rbd_glance_store_name = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.528769] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_rbd_pool = rbd {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.528936] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_type = default {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.529111] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.images_volume_group = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.529279] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.inject_key = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.529442] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.inject_partition = -2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.529609] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.inject_password = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.529798] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.iscsi_iface = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.529968] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.iser_use_multipath = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.530146] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.530312] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.530476] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_downtime = 500 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.530637] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.530797] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.530958] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_inbound_addr = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.531133] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.531337] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.531458] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_scheme = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.531634] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_timeout_action = abort {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.531800] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_tunnelled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.531966] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_uri = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.532133] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.live_migration_with_native_tls = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.532292] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.max_queues = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.532453] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.532736] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.nfs_mount_options = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.532920] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.533104] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.533273] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.533434] env[60164]: DEBUG 
oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.533600] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.533760] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.num_pcie_ports = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.534013] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.534237] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.pmem_namespaces = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.534410] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.quobyte_client_cfg = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.534701] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.534877] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.535055] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.535225] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.535390] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rbd_secret_uuid = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.535550] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rbd_user = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.535714] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.535884] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.536052] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rescue_image_id = None {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.536211] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rescue_kernel_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.536367] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rescue_ramdisk_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.536537] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.536695] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.rx_queue_size = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.536860] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.smbfs_mount_options = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.537143] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.537378] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.snapshot_compression = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.537509] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.snapshot_image_format = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.537730] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.537900] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.sparse_logical_volumes = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.538076] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.swtpm_enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.538250] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.swtpm_group = tss {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.538416] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.swtpm_user = tss {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.538585] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.sysinfo_serial = unique {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.538859] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.tx_queue_size = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.538911] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.uid_maps = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.539074] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.use_virtio_for_bridges = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.539250] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.virt_type = kvm {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.539418] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.volume_clear = zero {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.539582] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.volume_clear_size = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.539776] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.volume_use_multipath = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.539941] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_cache_path = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.540124] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.540294] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.540460] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.540626] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.540910] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.541097] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.vzstorage_mount_user = stack {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.541265] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.541437] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.541610] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.auth_type = password {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.541768] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.541934] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.542106] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.542263] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.542422] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.542590] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.default_floating_pool = public {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.542749] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.542912] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.extension_sync_interval = 600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.543084] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.http_retries = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.543249] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.543406] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.543568] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.543730] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.543887] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.544063] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.ovs_bridge = br-int {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.544234] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.physnets = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.544397] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.region_name = RegionOne {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.544566] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.service_metadata_proxy = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.544726] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.service_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.544892] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.service_type = network {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.545067] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.545227] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.545383] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.545542] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.545722] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.545917] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] neutron.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.546107] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] notifications.bdms_in_notifications = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.546287] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] notifications.default_level = INFO 
{{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.546461] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] notifications.notification_format = unversioned {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.546624] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] notifications.notify_on_state_change = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.546798] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.546973] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] pci.alias = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.547154] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] pci.device_spec = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.547557] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] pci.report_in_placement = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.547557] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.547691] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.auth_type = password {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.547857] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.548028] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.548190] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.548371] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.548541] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.548700] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.548859] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.default_domain_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.549025] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.default_domain_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.549184] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.domain_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.549342] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.domain_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.549499] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.549683] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.549859] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.550026] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.550186] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.550354] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.password = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.550515] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.project_domain_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.550679] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.project_domain_name = Default {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.550842] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.project_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551023] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.project_name = service {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551188] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.region_name = RegionOne {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551347] env[60164]: DEBUG oslo_service.service 
[None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.service_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551516] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.service_type = placement {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551678] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551832] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.551988] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.552158] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.system_scope = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.552314] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.552482] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.trust_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.552632] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.user_domain_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.552797] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.user_domain_name = Default {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.552962] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.user_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.553155] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.username = placement {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.553336] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.553497] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] placement.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.553671] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.cores = 20 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.553835] env[60164]: DEBUG 
oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.count_usage_from_placement = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.554010] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.554188] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.injected_file_content_bytes = 10240 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.554353] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.injected_file_path_length = 255 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.554519] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.injected_files = 5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.554681] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.instances = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.554842] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.key_pairs = 100 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.555024] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.metadata_items = 128 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.555184] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.ram = 51200 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.555350] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.recheck_quota = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.555514] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.server_group_members = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.555677] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] quota.server_groups = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.555845] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rdp.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.556174] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.556360] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.556530] 
env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.556694] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.image_metadata_prefilter = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.556857] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.557031] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.max_attempts = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.557197] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.max_placement_results = 1000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.557388] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.557561] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.query_placement_for_availability_zone = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.557724] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.557885] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.558068] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] scheduler.workers = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.558247] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.558439] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.558627] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.558799] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.558969] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.559149] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.559312] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.559502] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.559755] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.host_subset_size = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.559859] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.560034] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.560226] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.isolated_hosts = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.560371] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.isolated_images = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.560535] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.560696] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.560866] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.pci_in_placement = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.561027] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.561194] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.561357] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.561520] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.561817] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.561907] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.562011] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.track_instance_changes = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.562192] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.562367] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metrics.required = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.562522] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metrics.weight_multiplier = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.562684] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.562848] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] metrics.weight_setting = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.563178] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.563358] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] serial_console.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.563537] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] serial_console.port_range = 10000:20000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.563708] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.563877] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.564062] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] serial_console.serialproxy_port = 6083 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.564233] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.564407] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.auth_type = password {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.564569] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.564845] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.564880] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.565046] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.565247] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.565378] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.send_service_user_token = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.565542] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] service_user.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.565700] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None 
None] service_user.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.565870] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.agent_enabled = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.566056] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.566359] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.566556] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.566727] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.html5proxy_port = 6082 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.566887] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.image_compression = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.567055] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.jpeg_compression = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.567218] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.playback_compression = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.567416] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.server_listen = 127.0.0.1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.567625] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.567747] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.streaming_mode = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.567951] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] spice.zlib_compression = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.568077] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] upgrade_levels.baseapi = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.568251] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] upgrade_levels.cert = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.568421] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] upgrade_levels.compute = auto {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.568586] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] upgrade_levels.conductor = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.568745] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] upgrade_levels.scheduler = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.568909] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.569101] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.569236] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.569391] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.569549] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.569736] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.569904] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.570100] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.570267] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vendordata_dynamic_auth.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.570441] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.api_retry_count = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.570604] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.ca_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.570775] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.570940] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.cluster_name = testcl1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.571115] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.connection_pool_size = 10 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.571276] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.console_delay_seconds = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.571443] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.datastore_regex = ^datastore.* {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.571653] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.571825] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.host_password = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.571991] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.host_port = 443 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.572171] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.host_username = administrator@vsphere.local {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.572340] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.insecure = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.572500] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.integration_bridge = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.572660] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.maximum_objects = 100 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.572912] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.pbm_default_policy = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.572967] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.pbm_enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.573132] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.pbm_wsdl_location = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.573297] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.573457] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.serial_port_proxy_uri = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.573611] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.serial_port_service_uri = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.573771] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.task_poll_interval = 0.5 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.573940] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.use_linked_clone = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.574129] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.vnc_keymap = en-us {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.574295] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.vnc_port = 5900 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.574458] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vmware.vnc_port_total = 10000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.574665] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.auth_schemes = ['none'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.574813] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.575124] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.575348] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.575481] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.novncproxy_port = 6080 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.575660] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.server_listen = 127.0.0.1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.575831] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.575992] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 
None None] vnc.vencrypt_ca_certs = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.576163] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.vencrypt_client_cert = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.576318] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vnc.vencrypt_client_key = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.576497] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.576818] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.576818] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.576969] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.577140] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.disable_rootwrap = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.577305] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.enable_numa_live_migration = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.577488] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.577655] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.577809] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.577967] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.libvirt_disable_apic = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.578140] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.578309] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] 
workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.578516] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.578691] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.578857] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.579028] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.579196] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.579359] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.579523] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.579710] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.579906] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.580089] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.client_socket_timeout = 900 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.580258] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.default_pool_size = 1000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.580424] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.keep_alive = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.580590] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.max_header_line = 16384 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.580748] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.secure_proxy_ssl_header 
= None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.580907] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.ssl_ca_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.581073] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.ssl_cert_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.581234] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.ssl_key_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.581396] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.tcp_keepidle = 600 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.581567] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.581730] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] zvm.ca_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.581886] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] zvm.cloud_connector_url = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.582191] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.582365] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] zvm.reachable_timeout = 300 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.582683] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.enforce_new_defaults = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.582783] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.enforce_scope = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.582884] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.policy_default_rule = default {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.583076] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.583249] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.policy_file = policy.yaml {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.583418] 
env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.583578] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.583734] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.583890] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.584062] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.584232] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.584406] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.584580] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.connection_string = messaging:// {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.584748] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.enabled = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.584915] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.es_doc_type = notification {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.585089] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.es_scroll_size = 10000 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.585258] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.es_scroll_time = 2m {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.585447] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.filter_error_trace = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.585583] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.hmac_keys = SECRET_KEY {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.585750] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.sentinel_service_name = mymaster {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.585918] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.socket_timeout = 0.1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.586091] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] profiler.trace_sqlalchemy = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.586259] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] remote_debug.host = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.586417] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] remote_debug.port = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.586592] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.586754] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.586915] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.587089] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.587254] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.587462] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.587633] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.587796] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.587956] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.588125] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] 
oslo_messaging_rabbit.kombu_compression = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.588298] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.588542] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.588731] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.588901] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.589078] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.589255] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.589418] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.589582] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.589769] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.589939] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.590114] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.590283] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.590445] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.590610] env[60164]: DEBUG 
oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.590775] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.590939] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.ssl = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.591120] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.591294] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.591460] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.591628] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.591798] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.591984] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.592162] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_notifications.retry = -1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.592345] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.592523] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.592691] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.auth_section = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.592850] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.auth_type = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593015] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None 
None] oslo_limit.cafile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593174] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.certfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593339] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.collect_timing = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593499] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.connect_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593656] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.connect_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593813] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.endpoint_id = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.593970] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.endpoint_override = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.594144] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.insecure = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.594324] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.keyfile = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.594516] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.max_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.594681] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.min_version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.594860] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.region_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.595084] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.service_name = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.595261] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.service_type = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.595431] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.split_loggers = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.595622] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None 
None] oslo_limit.status_code_retries = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.595791] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.status_code_retry_delay = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.595950] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.timeout = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.596122] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.valid_interfaces = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.596281] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_limit.version = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.596446] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_reports.file_event_handler = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.596617] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.596770] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] oslo_reports.log_dir = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.596939] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.597128] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.597330] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.597501] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.597670] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.597829] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.598011] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60164) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.598175] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_ovs_privileged.group = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.598350] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.598526] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.598689] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.598845] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] vif_plug_ovs_privileged.user = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.599025] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.599209] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.599381] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.599552] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.599746] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.599924] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.600105] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.600269] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.600448] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_ovs.isolate_vif = False {{(pid=60164) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.600618] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.600784] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.600953] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.601133] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.601297] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_vif_ovs.per_port_bridge = False {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.601462] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] os_brick.lock_path = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.601627] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] privsep_osbrick.capabilities = [21] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.601785] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] privsep_osbrick.group = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.601940] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] privsep_osbrick.helper_command = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.602113] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.602277] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.602432] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] privsep_osbrick.user = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.602602] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.602758] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] nova_sys_admin.group = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.602912] env[60164]: DEBUG oslo_service.service [None 
req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] nova_sys_admin.helper_command = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.603084] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.603248] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.603407] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] nova_sys_admin.user = None {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 549.603540] env[60164]: DEBUG oslo_service.service [None req-2ec29404-4cac-4f88-aa73-8a7124f44978 None None] ******************************************************************************** {{(pid=60164) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 549.603954] env[60164]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 549.614594] env[60164]: INFO nova.virt.node [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Generated node identity ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f [ 549.614838] env[60164]: INFO nova.virt.node [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Wrote node identity ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f to /opt/stack/data/n-cpu-1/compute_id [ 549.625697] env[60164]: WARNING nova.compute.manager [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Compute nodes ['ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 549.656544] env[60164]: INFO nova.compute.manager [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 549.676140] env[60164]: WARNING nova.compute.manager [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
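The long "group.option = value" dump that ends just above (every line tagged log_opt_values, closed by the row of asterisks) is emitted by oslo.config when the service starts. The sketch below is a minimal, self-contained illustration of that mechanism, not Nova code: the example options are made up, but ConfigOpts.log_opt_values() is the same method named in the log lines, and options registered with secret=True are what appear masked as **** (as vmware.host_password does above).

# Minimal illustration (assumed example options) of how oslo.config produces the
# "group.option = value" dump seen above via ConfigOpts.log_opt_values().
import logging
from oslo_config import cfg

CONF = cfg.CONF
CONF.register_opts(
    [
        cfg.StrOpt('host_ip', default='127.0.0.1'),
        cfg.StrOpt('host_password', secret=True),  # secret options are logged as ****
    ],
    group='vmware',
)

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF([], project='example')                  # parse an (empty) command line / config files
CONF.log_opt_values(LOG, logging.DEBUG)      # one DEBUG line per registered option

Run directly, this prints a dump in the same "vmware.host_ip = 127.0.0.1" shape as the lines above, which is useful when reading such logs to confirm which config file values the service actually picked up.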
[ 549.676374] env[60164]: DEBUG oslo_concurrency.lockutils [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 549.676572] env[60164]: DEBUG oslo_concurrency.lockutils [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 549.676715] env[60164]: DEBUG oslo_concurrency.lockutils [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 549.676869] env[60164]: DEBUG nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 549.678028] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-460c94e0-8c18-4e4a-bcac-bb777cd41c7e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.686981] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c894a09a-54cd-4bf3-8132-dc9c19ece49c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.701016] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-837537ff-b5b2-4832-a356-7d1d264e1e73 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.707415] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73ba0386-1ae3-4072-8ca3-2035f0d70ffa {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.736496] env[60164]: DEBUG nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181500MB free_disk=139GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 549.736639] env[60164]: DEBUG oslo_concurrency.lockutils [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 549.736813] env[60164]: DEBUG oslo_concurrency.lockutils [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 549.748226] env[60164]: WARNING nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] No compute node 
record for cpu-1:ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f could not be found. [ 549.762618] env[60164]: INFO nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f [ 549.810593] env[60164]: DEBUG nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 549.810753] env[60164]: DEBUG nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 549.908198] env[60164]: INFO nova.scheduler.client.report [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] [req-598dab6b-db40-4409-8b03-64fb07077fca] Created resource provider record via placement API for resource provider with UUID ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 549.924015] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55553dd2-fd4d-4cd2-a62a-b33ce431029b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.931961] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35b32b71-a4cb-4a99-b93f-bfc2f8a0a023 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.962815] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6582a91f-7356-45e3-8210-210ab7a81373 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.968504] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dbd0d06-4c9f-4120-bc10-990bff3578c4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.981357] env[60164]: DEBUG nova.compute.provider_tree [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 550.015033] env[60164]: DEBUG nova.scheduler.client.report [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Updated inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 550.015273] env[60164]: DEBUG nova.compute.provider_tree [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Updating resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f generation from 0 to 1 during operation: update_inventory {{(pid=60164) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 550.015444] env[60164]: DEBUG nova.compute.provider_tree [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 550.056763] env[60164]: DEBUG nova.compute.provider_tree [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Updating resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f generation from 1 to 2 during operation: update_traits {{(pid=60164) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 550.073269] env[60164]: DEBUG nova.compute.resource_tracker [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 550.073440] env[60164]: DEBUG oslo_concurrency.lockutils [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 550.073595] env[60164]: DEBUG nova.service [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Creating RPC server for service compute {{(pid=60164) start /opt/stack/nova/nova/service.py:182}} [ 550.086644] env[60164]: DEBUG nova.service [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] Join ServiceGroup membership for this service compute {{(pid=60164) start /opt/stack/nova/nova/service.py:199}} [ 550.086827] env[60164]: DEBUG nova.servicegroup.drivers.db [None req-cf3a4e03-e191-42f7-9a21-26571b1c47a6 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60164) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 568.093332] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 568.103688] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Getting list of instances from cluster (obj){ [ 568.103688] env[60164]: value = "domain-c8" [ 568.103688] env[60164]: _type = "ClusterComputeResource" [ 568.103688] env[60164]: } {{(pid=60164) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 568.104822] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d67cb84c-ac5d-4954-aa9d-bfefe1e58e1a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.113929] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Got total of 0 instances {{(pid=60164) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 568.114156] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 568.114445] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Getting list of instances from cluster (obj){ [ 568.114445] env[60164]: value = "domain-c8" [ 568.114445] env[60164]: _type = "ClusterComputeResource" [ 568.114445] env[60164]: } {{(pid=60164) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 568.115349] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25c7423-4c71-47ed-a73b-286016ea0f8f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.122625] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Got total of 0 instances {{(pid=60164) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 592.725682] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquiring lock "f1894e2a-156c-420c-91af-a4eedaafb017" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.726015] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Lock "f1894e2a-156c-420c-91af-a4eedaafb017" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.746327] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Starting instance... 
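
The inventory the resource tracker pushed to Placement above (VCPU total=48 with allocation_ratio=4.0, MEMORY_MB total=196590 with 512 reserved, DISK_GB total=400 with max_unit=139) is what the scheduler consumes when it places instances on this node. A minimal sketch of the per-resource-class capacity arithmetic, using only the numbers from the log (illustrative Python, not Nova or Placement code):

    # usable capacity = (total - reserved) * allocation_ratio; a single
    # allocation is further capped at max_unit and granular to step_size.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 139},
    }

    def usable(inv):
        return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

    for rc, inv in inventory.items():
        print(rc, usable(inv), 'usable,', inv['max_unit'], 'max per single allocation')
    # VCPU 192 usable, 16 max per single allocation
    # MEMORY_MB 196078 usable, 65530 max per single allocation
    # DISK_GB 400 usable, 139 max per single allocation
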
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 592.852398] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.852750] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.855092] env[60164]: INFO nova.compute.claims [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 592.996189] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a30604dd-af4a-495b-9d72-e380f1bddb79 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.007288] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9d63d80-14f0-4fa3-b33d-3f2500ef734e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.042443] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c02b25b-a71d-4ffc-916e-5152466a9e22 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.051227] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-065b0189-676b-4d78-b3a8-010d7433371c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.063881] env[60164]: DEBUG nova.compute.provider_tree [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 593.079693] env[60164]: DEBUG nova.scheduler.client.report [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 593.128512] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 
tempest-ServerDiagnosticsTest-844409055-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.128512] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 593.166609] env[60164]: DEBUG nova.compute.utils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 593.167954] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 593.168200] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 593.183594] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 593.266417] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Start spawning the instance on the hypervisor. 
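
The repeated "Acquiring lock / acquired / released" triplets above come from oslo.concurrency's named in-process lock: every claim against this node serializes on the single "compute_resources" semaphore, which is why each record reports how long the caller waited and how long it held the lock. A minimal sketch of that locking pattern with the lockutils API (illustrative only; instance_claim here is a stand-in, not Nova's ResourceTracker method):

    from oslo_concurrency import lockutils

    # Decorator form: all callers of instance_claim serialize on one named lock.
    @lockutils.synchronized('compute_resources')
    def instance_claim(instance_uuid, node):
        # ... test free resources, record the claim, update usage ...
        return True

    # Equivalent context-manager form:
    with lockutils.lock('compute_resources'):
        pass  # critical section
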
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 593.547344] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquiring lock "c731bc6a-9b0d-4e3a-b5ca-009d79896d27" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.547588] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Lock "c731bc6a-9b0d-4e3a-b5ca-009d79896d27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.569326] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 593.635094] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.635094] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.636681] env[60164]: INFO nova.compute.claims [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 593.739276] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), 
allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 593.739627] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 593.739820] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 593.740551] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 593.740551] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 593.740551] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 593.740551] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 593.740741] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 593.741094] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 593.741270] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 593.741457] env[60164]: DEBUG nova.virt.hardware [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 593.742718] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6625aec5-3f09-469a-9ec7-317966c733b6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.762913] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a5ccd36-4889-4d9e-8b27-0f76899b980a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.770973] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07d52148-9c96-41a8-9d9f-eb55e02312a0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.794767] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fe64695-c2bb-44e9-9df8-e49109f2cf77 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.810865] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c864fee-d70f-4d1f-b566-718b1e225adf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.851887] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22338f6b-f009-47e4-9a4f-f991bc5f83ca {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.859318] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c72f202-52bd-472e-a201-8a39cc5c7483 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.872682] env[60164]: DEBUG nova.compute.provider_tree [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 593.881815] env[60164]: DEBUG nova.scheduler.client.report [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 593.899472] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.899775] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: 
c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 593.941020] env[60164]: DEBUG nova.compute.utils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 593.944978] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 593.945297] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 593.969162] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 594.071336] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Start spawning the instance on the hypervisor. 
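
The nova.virt.hardware lines above walk the CPU topology search for the 1-vCPU m1.nano flavor: with no flavor or image constraints the limits default to 65536 sockets/cores/threads, every factorization of the vCPU count is considered, and for a single vCPU the only candidate is 1 socket x 1 core x 1 thread. A rough sketch of that enumeration (an illustration of the logged behaviour, not the actual hardware.py implementation):

    from collections import namedtuple

    VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Every (sockets, cores, threads) whose product equals vcpus and
        # respects the per-dimension limits is a candidate topology.
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield VirtCPUTopology(s, c, t)

    print(list(possible_topologies(1)))
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
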
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 594.115179] env[60164]: DEBUG nova.policy [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8116a4b14c4148ce9863afdb6dd7d571', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '62b2e3dba50f46108124ecd6f560df03', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 594.119164] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 594.119376] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 594.119524] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 594.119694] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 594.119827] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 594.119961] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} [ 594.120169] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 594.120316] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 594.120467] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 594.120691] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 594.120869] env[60164]: DEBUG nova.virt.hardware [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 594.122105] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b76ffea-b15b-4dc7-8716-23477f6f4ba0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.127491] env[60164]: DEBUG nova.policy [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74c20d44ea7e4427ae91b7d5cb67941f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '73f02488e0f04572afd9b672c2b5cf0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 594.134754] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquiring lock "09e936da-040a-438a-a320-28616de7bb75" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.135037] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] 
Lock "09e936da-040a-438a-a320-28616de7bb75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.140210] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f5cfe59-0ccb-479e-99ec-20365ccb4eaf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.161860] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 594.225886] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.225886] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.226289] env[60164]: INFO nova.compute.claims [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 594.355632] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd25170c-d10a-471a-825f-c0d5e0c82bab {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.363520] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e784f73-51ec-4f15-ac3d-6c5dae43d3b1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.396345] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7752e2a6-ea69-49ed-8165-fc4f65388e4f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.408852] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95ec8ed9-1b9a-4353-8737-7b75be733456 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.424138] env[60164]: DEBUG nova.compute.provider_tree [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 594.436519] env[60164]: DEBUG nova.scheduler.client.report [None 
req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 594.452625] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.455664] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 594.495536] env[60164]: DEBUG nova.compute.utils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 594.496915] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 594.498458] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 594.514817] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 594.622091] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 594.654785] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 594.654785] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 594.654785] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 594.655659] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 594.655887] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 594.656095] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 594.656322] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 594.656536] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 
594.658114] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 594.658114] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 594.658114] env[60164]: DEBUG nova.virt.hardware [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 594.658114] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a66a2ae-e020-4479-b18e-e70abd652b02 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.667561] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a8ce621-bd08-470b-837d-1293830e43f6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.018548] env[60164]: DEBUG nova.policy [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e7b888f72654a79b43f8a2c47299416', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '859a3caa90ab40879a27f7eb2ba8908b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 595.813805] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Successfully created port: bd6c3933-bc0e-43ec-b6e3-dae79283705c {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 595.971564] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Successfully created port: 61b1b63d-b5cb-4d06-b6ff-cda84a011f31 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 596.171442] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquiring lock "3681b35b-c962-4e80-8f9c-df0db2f515e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.172423] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Lock "3681b35b-c962-4e80-8f9c-df0db2f515e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.196899] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquiring lock "dbb50c25-381f-4878-945b-170f2681f2ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.196899] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Lock "dbb50c25-381f-4878-945b-170f2681f2ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.196899] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 596.219838] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Starting instance... 
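
While the guests are being spawned, the network allocation that was started asynchronously completes: the "Successfully created port" lines above show Neutron ports bd6c3933-... and 61b1b63d-... being created for the first two instances. Roughly the same operation done by hand with the OpenStack SDK looks like the sketch below (the cloud name and network name are hypothetical; this is not what Nova's allocate_for_instance does internally):

    import openstack

    # Assumes a clouds.yaml entry named 'devstack' and a tenant network named 'private'.
    conn = openstack.connect(cloud='devstack')
    network = conn.network.find_network('private')
    port = conn.network.create_port(network_id=network.id,
                                    name='sketch-port')
    print(port.id, port.status)
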
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 596.270386] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.273912] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.273912] env[60164]: INFO nova.compute.claims [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 596.282624] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.439469] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc806f22-8223-48b7-861f-5dc8fb1b4521 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.456161] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14b67fe3-fa5e-4336-b470-9f7f87f3c61f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.495567] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95d8f809-e307-4720-ba39-15eb1de5255f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.503741] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dd3a294-1d8d-4cdd-8ba4-5725208812b3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.519129] env[60164]: DEBUG nova.compute.provider_tree [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 596.529923] env[60164]: DEBUG nova.scheduler.client.report [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 596.549166] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.551767] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 596.558634] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.276s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.559918] env[60164]: INFO nova.compute.claims [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 596.607035] env[60164]: DEBUG nova.compute.utils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 596.608506] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 596.608506] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 596.626995] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 596.636634] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Successfully created port: 4aa0c292-e1ce-4f56-9399-62866cbc7ba7 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 596.713132] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 596.744872] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 596.745358] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 596.745358] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 596.747176] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 596.747342] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 596.747499] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 596.747728] env[60164]: DEBUG 
nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 596.747884] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 596.748058] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 596.748217] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 596.748382] env[60164]: DEBUG nova.virt.hardware [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 596.749281] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89ea6c92-9a32-4ce7-adda-84cd9106cf38 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.753386] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ebeb6f6-23d8-4e46-995e-bd56fc5da32c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.764163] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1873d438-c8ce-4f6d-964d-9c06107930b1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.770080] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e06cd86-1234-4ac2-8d63-e84869966f5b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.809128] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e2ae53-c93a-49a1-a867-eb7709104198 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.818030] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-238b2a57-8eed-4461-9362-95a23d61a57d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.831223] env[60164]: DEBUG nova.compute.provider_tree [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Inventory has not changed in ProviderTree for provider: 
ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 596.848250] env[60164]: DEBUG nova.scheduler.client.report [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 596.881750] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.882874] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 596.940555] env[60164]: DEBUG nova.compute.utils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 596.946301] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 596.946615] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 596.967647] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 597.066638] env[60164]: DEBUG nova.policy [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aedbb3047ca14d2aa7cd02b30a892ba2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd26fa229f324ee7aca2569d83ed0032', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 597.080667] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 597.120522] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 597.120522] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 597.120522] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 597.120742] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 597.120742] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 597.120742] env[60164]: DEBUG nova.virt.hardware [None 
req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 597.120859] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 597.121022] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 597.125115] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 597.125247] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 597.125475] env[60164]: DEBUG nova.virt.hardware [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 597.126693] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aad03cd2-5c08-4ca5-80a4-18044dfbfd68 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.141354] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2804eb-d497-4ea2-a049-1780a83a4076 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.291885] env[60164]: DEBUG nova.policy [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a30fc787cf3745b9bc8e385a9f26749f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc4180d54c1c427081e8004cc159478f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 598.219349] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquiring lock "4dbfaeea-229a-4ed1-afb2-bd8e167a1385" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.219630] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Lock "4dbfaeea-229a-4ed1-afb2-bd8e167a1385" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.229589] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 598.297628] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.298612] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.300957] env[60164]: INFO nova.compute.claims [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 598.480437] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56d748e6-1c50-4058-a359-9cb31f5bdd2f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.488692] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e32ffb19-c9b7-4ee3-a260-6ac6127dffe1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.525217] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66a7466f-f580-484e-809e-3713d603fe79 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.534484] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf47372-3317-4af7-9925-d6ecdee870a9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.548415] env[60164]: DEBUG nova.compute.provider_tree [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 598.562535] env[60164]: DEBUG 
nova.scheduler.client.report [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 598.578258] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.578775] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 598.628094] env[60164]: DEBUG nova.compute.utils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 598.631977] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 598.632236] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 598.646114] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 598.736072] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 598.773479] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 598.773702] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 598.773848] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 598.774761] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 598.774761] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 598.774761] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 598.774761] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 598.774761] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 598.774961] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] 
Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 598.774961] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 598.775271] env[60164]: DEBUG nova.virt.hardware [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 598.776289] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a69cc82-a719-4575-b9df-4f4db8267589 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.784700] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-805e495c-18a5-47d5-a550-dbd6fcfa3917 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.207808] env[60164]: DEBUG nova.policy [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be9e4c9d8e1249f7b4691aa88666769d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33983ea7f05b435da567e01fa0715162', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 599.331502] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Successfully created port: 83c6510e-dc8c-4f57-b4d1-a3af8164ac57 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 599.345862] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Successfully created port: 8d00bd2a-2402-4f54-9624-959f29f21eda {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 600.822517] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquiring lock "93b26227-ad64-4343-aed9-ba6622aaf83e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.822841] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Lock 
"93b26227-ad64-4343-aed9-ba6622aaf83e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.833514] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 600.895094] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.895614] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.896905] env[60164]: INFO nova.compute.claims [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 600.931328] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Successfully created port: 25193a61-bed4-4d25-8d37-d6270569cf1a {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 601.066121] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41d82dca-12e2-48bb-afb5-1f7d9d6b4f3f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.075312] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73e645aa-74aa-41f8-80c8-45024d2659ab {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.112142] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47617376-ad08-4c15-a13b-424a348b5fa9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.119921] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01165769-9653-4b2e-a1df-9db49d7777d2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.135750] env[60164]: DEBUG nova.compute.provider_tree [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] 
Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 601.146120] env[60164]: DEBUG nova.scheduler.client.report [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 601.162731] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 601.163763] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 601.205746] env[60164]: DEBUG nova.compute.utils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 601.207671] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 601.207897] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 601.218884] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 601.322191] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 601.350072] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 601.350320] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 601.350477] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 601.350639] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 601.350782] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 601.351204] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 601.354563] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 
tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 601.355237] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 601.355237] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 601.355237] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 601.355435] env[60164]: DEBUG nova.virt.hardware [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 601.356142] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-174c916b-9c83-413d-9e28-fd9874b7c7f1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.367655] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c0e077c-666a-4deb-bbc6-c4379ccf1beb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.463673] env[60164]: DEBUG nova.policy [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8c45812d2d1b498eabb8228cdd1fd48b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5275c625e8af40b28d4612c1473115df', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 604.187235] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Successfully created port: 26abace3-118f-40c6-882e-a81b73e76917 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 604.450621] env[60164]: ERROR nova.compute.manager [None 
req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. [ 604.450621] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 604.450621] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 604.450621] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 604.450621] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 604.450621] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 604.450621] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 604.450621] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 604.450621] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 604.450621] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 604.450621] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 604.450621] env[60164]: ERROR nova.compute.manager raise self.value [ 604.450621] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 604.450621] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 604.450621] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 604.450621] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 604.452986] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 604.452986] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 604.452986] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. 
[ 604.452986] env[60164]: ERROR nova.compute.manager [ 604.452986] env[60164]: Traceback (most recent call last): [ 604.452986] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 604.452986] env[60164]: listener.cb(fileno) [ 604.452986] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 604.452986] env[60164]: result = function(*args, **kwargs) [ 604.452986] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 604.452986] env[60164]: return func(*args, **kwargs) [ 604.452986] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 604.452986] env[60164]: raise e [ 604.452986] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 604.452986] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 604.452986] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 604.452986] env[60164]: created_port_ids = self._update_ports_for_instance( [ 604.452986] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 604.452986] env[60164]: with excutils.save_and_reraise_exception(): [ 604.452986] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 604.452986] env[60164]: self.force_reraise() [ 604.452986] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 604.452986] env[60164]: raise self.value [ 604.452986] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 604.452986] env[60164]: updated_port = self._update_port( [ 604.452986] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 604.452986] env[60164]: _ensure_no_port_binding_failure(port) [ 604.452986] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 604.452986] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 604.453830] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. [ 604.453830] env[60164]: Removing descriptor: 12 [ 604.453830] env[60164]: ERROR nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. 
[ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Traceback (most recent call last): [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] yield resources [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self.driver.spawn(context, instance, image_meta, [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self._vmops.spawn(context, instance, image_meta, injected_files, [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 604.453830] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] vm_ref = self.build_virtual_machine(instance, [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] vif_infos = vmwarevif.get_vif_info(self._session, [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] for vif in network_info: [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return self._sync_wrapper(fn, *args, **kwargs) [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self.wait() [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self[:] = self._gt.wait() [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return self._exit_event.wait() [ 604.454133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 604.454133] env[60164]: ERROR nova.compute.manager 
[instance: f1894e2a-156c-420c-91af-a4eedaafb017] result = hub.switch() [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return self.greenlet.switch() [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] result = function(*args, **kwargs) [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return func(*args, **kwargs) [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] raise e [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] nwinfo = self.network_api.allocate_for_instance( [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] created_port_ids = self._update_ports_for_instance( [ 604.454475] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] with excutils.save_and_reraise_exception(): [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self.force_reraise() [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] raise self.value [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] updated_port = self._update_port( [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: 
f1894e2a-156c-420c-91af-a4eedaafb017] _ensure_no_port_binding_failure(port) [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] raise exception.PortBindingFailed(port_id=port['id']) [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. [ 604.454797] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] [ 604.455131] env[60164]: INFO nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Terminating instance [ 604.455861] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquiring lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.456033] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquired lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 604.456192] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 604.593666] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 604.655960] env[60164]: DEBUG nova.compute.manager [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Received event network-changed-61b1b63d-b5cb-4d06-b6ff-cda84a011f31 {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 604.656207] env[60164]: DEBUG nova.compute.manager [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Refreshing instance network info cache due to event network-changed-61b1b63d-b5cb-4d06-b6ff-cda84a011f31. 
{{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 604.656334] env[60164]: DEBUG oslo_concurrency.lockutils [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] Acquiring lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.850104] env[60164]: ERROR nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. [ 604.850104] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 604.850104] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 604.850104] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 604.850104] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 604.850104] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 604.850104] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 604.850104] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 604.850104] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 604.850104] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 604.850104] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 604.850104] env[60164]: ERROR nova.compute.manager raise self.value [ 604.850104] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 604.850104] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 604.850104] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 604.850104] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 604.850530] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 604.850530] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 604.850530] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. 
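The traceback above bottoms out in _ensure_no_port_binding_failure raising nova.exception.PortBindingFailed. As a rough, hedged illustration of what such a guard amounts to (not Nova's actual code; the 'binding:vif_type' key and 'binding_failed' value are assumed Neutron conventions, not taken from this log):

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__("Binding failed for port %s, please check neutron "
                             "logs for more information." % port_id)

    def ensure_no_port_binding_failure(port):
        # If Neutron reports the binding as failed, abort the build with a
        # specific exception so the caller can clean up and reschedule.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    # passes silently for a healthy port, raises for a failed one
    ensure_no_port_binding_failure({'id': '61b1b63d', 'binding:vif_type': 'ovs'})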
[ 604.850530] env[60164]: ERROR nova.compute.manager [ 604.850530] env[60164]: Traceback (most recent call last): [ 604.850530] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 604.850530] env[60164]: listener.cb(fileno) [ 604.850530] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 604.850530] env[60164]: result = function(*args, **kwargs) [ 604.850530] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 604.850530] env[60164]: return func(*args, **kwargs) [ 604.850530] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 604.850530] env[60164]: raise e [ 604.850530] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 604.850530] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 604.850530] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 604.850530] env[60164]: created_port_ids = self._update_ports_for_instance( [ 604.850530] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 604.850530] env[60164]: with excutils.save_and_reraise_exception(): [ 604.850530] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 604.850530] env[60164]: self.force_reraise() [ 604.850530] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 604.850530] env[60164]: raise self.value [ 604.850530] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 604.850530] env[60164]: updated_port = self._update_port( [ 604.850530] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 604.850530] env[60164]: _ensure_no_port_binding_failure(port) [ 604.850530] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 604.850530] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 604.851338] env[60164]: nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. [ 604.851338] env[60164]: Removing descriptor: 14 [ 604.851338] env[60164]: ERROR nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. 
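The frames above also show the oslo.utils save_and_reraise_exception context manager: cleanup code runs inside the with-block, then the original exception is re-raised on exit. A minimal, self-contained sketch of that pattern (assumes oslo.utils is installed; the cleanup helper here is made up for illustration):

    from oslo_utils import excutils

    def cleanup():
        print("cleaning up partially created ports")

    def update_ports():
        try:
            raise ValueError("simulated port update failure")
        except Exception:
            # cleanup() runs, then the original ValueError is re-raised on exit
            with excutils.save_and_reraise_exception():
                cleanup()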
[ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Traceback (most recent call last): [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] yield resources [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self.driver.spawn(context, instance, image_meta, [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 604.851338] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] vm_ref = self.build_virtual_machine(instance, [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] vif_infos = vmwarevif.get_vif_info(self._session, [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] for vif in network_info: [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return self._sync_wrapper(fn, *args, **kwargs) [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self.wait() [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self[:] = self._gt.wait() [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return self._exit_event.wait() [ 604.851729] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 604.851729] env[60164]: ERROR nova.compute.manager 
[instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] result = hub.switch() [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return self.greenlet.switch() [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] result = function(*args, **kwargs) [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return func(*args, **kwargs) [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] raise e [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] nwinfo = self.network_api.allocate_for_instance( [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] created_port_ids = self._update_ports_for_instance( [ 604.852130] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] with excutils.save_and_reraise_exception(): [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self.force_reraise() [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] raise self.value [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] updated_port = self._update_port( [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: 
c731bc6a-9b0d-4e3a-b5ca-009d79896d27] _ensure_no_port_binding_failure(port) [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] raise exception.PortBindingFailed(port_id=port['id']) [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. [ 604.852466] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] [ 604.852778] env[60164]: INFO nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Terminating instance [ 604.855161] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquiring lock "refresh_cache-c731bc6a-9b0d-4e3a-b5ca-009d79896d27" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.855161] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquired lock "refresh_cache-c731bc6a-9b0d-4e3a-b5ca-009d79896d27" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 604.855161] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 604.992434] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 605.236652] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.250528] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Releasing lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 605.250947] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 605.251530] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 605.251967] env[60164]: DEBUG oslo_concurrency.lockutils [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] Acquired lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.252202] env[60164]: DEBUG nova.network.neutron [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Refreshing network info cache for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31 {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 605.256159] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-67d1b959-19bb-40ff-95c5-1008fa9fc4e0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.266921] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7de377e-4554-4984-b9a1-abfad2ed278d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.293739] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f1894e2a-156c-420c-91af-a4eedaafb017 could not be found. 
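The warning above ("Instance does not exist on backend: ... InstanceNotFound") is followed by "Instance destroyed", i.e. a missing VM is treated as already cleaned up rather than as a fatal error. A sketch of that tolerant-destroy shape, with illustrative names rather than Nova's actual ones:

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        pass

    def destroy(backend, instance_uuid):
        try:
            backend.delete_vm(instance_uuid)
        except InstanceNotFound as exc:
            # already gone from the backend: warn and carry on, as in the log
            LOG.warning("Instance does not exist on backend: %s", exc)
        LOG.debug("Instance destroyed")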
[ 605.293959] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 605.294358] env[60164]: INFO nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Took 0.04 seconds to destroy the instance on the hypervisor. [ 605.294626] env[60164]: DEBUG oslo.service.loopingcall [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 605.294817] env[60164]: DEBUG nova.compute.manager [-] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 605.294912] env[60164]: DEBUG nova.network.neutron [-] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 605.369972] env[60164]: DEBUG nova.network.neutron [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 605.385067] env[60164]: DEBUG nova.network.neutron [-] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 605.393622] env[60164]: DEBUG nova.network.neutron [-] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.403257] env[60164]: INFO nova.compute.manager [-] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Took 0.11 seconds to deallocate network for instance.
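The "Waiting for function ... _deallocate_network_with_retries to return" record above reflects network deallocation being wrapped in a retry loop. The general shape of such a wrapper, sketched in plain Python rather than the oslo.service looping-call machinery, with made-up attempt/delay values:

    import time

    def deallocate_network_with_retries(deallocate, attempts=3, delay=1.0):
        # Call deallocate() until it succeeds, re-raising only after the
        # final attempt; attempts and delay are illustrative defaults.
        for attempt in range(1, attempts + 1):
            try:
                return deallocate()
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(delay)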
[ 605.407332] env[60164]: DEBUG nova.compute.claims [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 605.407519] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.407737] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.503202] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.516725] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Releasing lock "refresh_cache-c731bc6a-9b0d-4e3a-b5ca-009d79896d27" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 605.517161] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 605.517388] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 605.517925] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a57a08a3-2a84-4879-a9b2-92829030967c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.531601] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10d1aa6f-1127-4a22-8574-fc3516401b94 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.557745] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c731bc6a-9b0d-4e3a-b5ca-009d79896d27 could not be found. [ 605.557860] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 605.557974] env[60164]: INFO nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Took 0.04 seconds to destroy the instance on the hypervisor. [ 605.558234] env[60164]: DEBUG oslo.service.loopingcall [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 605.558502] env[60164]: DEBUG nova.compute.manager [-] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 605.558601] env[60164]: DEBUG nova.network.neutron [-] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 605.713300] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6587ace-1a51-4d61-9e3b-0aaa5b3a2d69 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.718535] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a0290a-5641-49bc-9d76-271bf5657084 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.752343] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-617267f5-25af-4649-8820-2707b9aa7f87 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.759185] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f4b20c-0516-4c1e-aaf7-334f8f830364 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.774953] env[60164]: DEBUG nova.compute.provider_tree [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 605.783543] env[60164]: DEBUG nova.scheduler.client.report [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 605.803644] env[60164]: ERROR nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. 
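The scheduler report client above decides that "Inventory has not changed" by comparing the driver's reported inventory against what Placement already holds, keyed by resource class. A trivial sketch of that comparison over the same dict shape, using the figures from the record above (illustrative only, not the report client's code):

    def inventory_changed(current, reported):
        # each argument maps resource class -> totals/limits dict
        return current != reported

    reported = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139,
                    'step_size': 1, 'allocation_ratio': 1.0},
    }
    assert not inventory_changed(reported, {k: dict(v) for k, v in reported.items()})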
[ 605.803644] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 605.803644] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 605.803644] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 605.803644] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 605.803644] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 605.803644] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 605.803644] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 605.803644] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 605.803644] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 605.803644] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 605.803644] env[60164]: ERROR nova.compute.manager raise self.value [ 605.803644] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 605.803644] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 605.803644] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 605.803644] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 605.804114] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 605.804114] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 605.804114] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. 
[ 605.804114] env[60164]: ERROR nova.compute.manager [ 605.804114] env[60164]: Traceback (most recent call last): [ 605.804114] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 605.804114] env[60164]: listener.cb(fileno) [ 605.804114] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 605.804114] env[60164]: result = function(*args, **kwargs) [ 605.804114] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 605.804114] env[60164]: return func(*args, **kwargs) [ 605.804114] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 605.804114] env[60164]: raise e [ 605.804114] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 605.804114] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 605.804114] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 605.804114] env[60164]: created_port_ids = self._update_ports_for_instance( [ 605.804114] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 605.804114] env[60164]: with excutils.save_and_reraise_exception(): [ 605.804114] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 605.804114] env[60164]: self.force_reraise() [ 605.804114] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 605.804114] env[60164]: raise self.value [ 605.804114] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 605.804114] env[60164]: updated_port = self._update_port( [ 605.804114] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 605.804114] env[60164]: _ensure_no_port_binding_failure(port) [ 605.804114] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 605.804114] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 605.805230] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. [ 605.805230] env[60164]: Removing descriptor: 15 [ 605.805230] env[60164]: ERROR nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. 
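The repeated frames through model.py (_sync_wrapper, wait, greenthread.wait) show why the PortBindingFailed surfaces inside the VMware driver's spawn: the port allocation runs in a background greenthread and is only waited on when network_info is first iterated. A stripped-down sketch of that lazy wrapper, assuming eventlet is installed; this is not Nova's actual wrapper class:

    import eventlet

    class LazyNetworkInfo:
        """Resolve the allocation result only when the VIF list is first needed."""

        def __init__(self, allocate, *args, **kwargs):
            self._gt = eventlet.spawn(allocate, *args, **kwargs)
            self._vifs = None

        def wait(self):
            if self._vifs is None:
                # re-raises PortBindingFailed (or any other error) from the worker
                self._vifs = self._gt.wait()
            return self._vifs

        def __iter__(self):
            return iter(self.wait())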
[ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] Traceback (most recent call last): [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] yield resources [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self.driver.spawn(context, instance, image_meta, [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self._vmops.spawn(context, instance, image_meta, injected_files, [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 605.805230] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] vm_ref = self.build_virtual_machine(instance, [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] vif_infos = vmwarevif.get_vif_info(self._session, [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] for vif in network_info: [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return self._sync_wrapper(fn, *args, **kwargs) [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self.wait() [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self[:] = self._gt.wait() [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return self._exit_event.wait() [ 605.805554] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 605.805554] env[60164]: ERROR nova.compute.manager 
[instance: 09e936da-040a-438a-a320-28616de7bb75] result = hub.switch() [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return self.greenlet.switch() [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] result = function(*args, **kwargs) [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return func(*args, **kwargs) [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] raise e [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] nwinfo = self.network_api.allocate_for_instance( [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] created_port_ids = self._update_ports_for_instance( [ 605.805898] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] with excutils.save_and_reraise_exception(): [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self.force_reraise() [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] raise self.value [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] updated_port = self._update_port( [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 
09e936da-040a-438a-a320-28616de7bb75] _ensure_no_port_binding_failure(port) [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] raise exception.PortBindingFailed(port_id=port['id']) [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. [ 605.806395] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] [ 605.806757] env[60164]: INFO nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Terminating instance [ 605.807534] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquiring lock "refresh_cache-09e936da-040a-438a-a320-28616de7bb75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.807749] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquired lock "refresh_cache-09e936da-040a-438a-a320-28616de7bb75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.807964] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 605.811164] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.403s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.811164] env[60164]: ERROR nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. 
[ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Traceback (most recent call last): [ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self.driver.spawn(context, instance, image_meta, [ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self._vmops.spawn(context, instance, image_meta, injected_files, [ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 605.811164] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] vm_ref = self.build_virtual_machine(instance, [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] vif_infos = vmwarevif.get_vif_info(self._session, [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] for vif in network_info: [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return self._sync_wrapper(fn, *args, **kwargs) [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self.wait() [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self[:] = self._gt.wait() [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return self._exit_event.wait() [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 605.811411] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] result = hub.switch() [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 605.811777] env[60164]: ERROR 
nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return self.greenlet.switch() [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] result = function(*args, **kwargs) [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] return func(*args, **kwargs) [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] raise e [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] nwinfo = self.network_api.allocate_for_instance( [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] created_port_ids = self._update_ports_for_instance( [ 605.811777] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] with excutils.save_and_reraise_exception(): [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] self.force_reraise() [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] raise self.value [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] updated_port = self._update_port( [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] _ensure_no_port_binding_failure(port) [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 605.812133] env[60164]: ERROR 
nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] raise exception.PortBindingFailed(port_id=port['id']) [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] nova.exception.PortBindingFailed: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. [ 605.812133] env[60164]: ERROR nova.compute.manager [instance: f1894e2a-156c-420c-91af-a4eedaafb017] [ 605.812452] env[60164]: DEBUG nova.compute.utils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 605.816308] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Build of instance f1894e2a-156c-420c-91af-a4eedaafb017 was re-scheduled: Binding failed for port 61b1b63d-b5cb-4d06-b6ff-cda84a011f31, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 605.817277] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 605.817354] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquiring lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.839077] env[60164]: DEBUG nova.network.neutron [-] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 605.851177] env[60164]: DEBUG nova.network.neutron [-] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.862293] env[60164]: INFO nova.compute.manager [-] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Took 0.30 seconds to deallocate network for instance. 
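The "Acquiring lock ... by ...", "Lock ... acquired ... waited", and "released ... held" records throughout this run come from oslo.concurrency. Serialising critical sections under a named lock, as the resource tracker and cache-refresh paths do here, looks roughly like the following (a real oslo.concurrency API used illustratively, not Nova's code; assumes oslo.concurrency is installed):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(instance_uuid):
        # runs with the named lock held; acquisition/release is logged by oslo
        print('releasing resources claimed for %s' % instance_uuid)

    def refresh_cache(instance_uuid):
        # equivalent context-manager form for ad-hoc critical sections
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            print('rebuilding network info cache')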
[ 605.867035] env[60164]: DEBUG nova.compute.claims [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 605.867035] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.867035] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.896644] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.896644] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.896644] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Starting heal instance info cache {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9789}} [ 605.896644] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Rebuilding the list of instances to heal {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9793}} [ 605.920574] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 605.920745] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 09e936da-040a-438a-a320-28616de7bb75] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 605.921060] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 605.921060] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Skipping network cache update for instance because it is Building. 
{{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 605.921215] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 605.921269] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 605.921374] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Didn't find any instances for network info cache update. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9875}} [ 605.921877] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.923361] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.923361] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.923361] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.923361] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.923361] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.923361] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60164) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10408}} [ 605.925894] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 605.936139] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 605.940387] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.025651] env[60164]: DEBUG nova.network.neutron [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.045609] env[60164]: DEBUG oslo_concurrency.lockutils [req-4710e15f-e705-4447-a628-a482202fb395 req-55ae4fee-07e8-4736-b380-816bfaa8bad4 service nova] Releasing lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.046147] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Acquired lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.046537] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 606.093425] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eed19fe-3185-42c2-b677-77bf6055a525 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.106163] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7904b82b-2469-43be-b0a2-a340adb3484e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.140327] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-006c130e-c249-4ec8-bfdf-c89b568af999 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.147942] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c7604c5-9d22-43b1-88ad-9ab9d9faeb87 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.163132] env[60164]: DEBUG nova.compute.provider_tree [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 606.174622] env[60164]: DEBUG nova.scheduler.client.report [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on 
inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 606.200225] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.335s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.200875] env[60164]: ERROR nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Traceback (most recent call last): [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self.driver.spawn(context, instance, image_meta, [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] vm_ref = self.build_virtual_machine(instance, [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] vif_infos = vmwarevif.get_vif_info(self._session, [ 606.200875] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] for vif in network_info: [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return self._sync_wrapper(fn, *args, **kwargs) [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File 
"/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self.wait() [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self[:] = self._gt.wait() [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return self._exit_event.wait() [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] result = hub.switch() [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return self.greenlet.switch() [ 606.201269] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] result = function(*args, **kwargs) [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] return func(*args, **kwargs) [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] raise e [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] nwinfo = self.network_api.allocate_for_instance( [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] created_port_ids = self._update_ports_for_instance( [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] with excutils.save_and_reraise_exception(): [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 
227, in __exit__ [ 606.201657] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] self.force_reraise() [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] raise self.value [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] updated_port = self._update_port( [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] _ensure_no_port_binding_failure(port) [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] raise exception.PortBindingFailed(port_id=port['id']) [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] nova.exception.PortBindingFailed: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. [ 606.201992] env[60164]: ERROR nova.compute.manager [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] [ 606.202265] env[60164]: DEBUG nova.compute.utils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. 
{{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 606.204088] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.264s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.204245] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.204414] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 606.205261] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Build of instance c731bc6a-9b0d-4e3a-b5ca-009d79896d27 was re-scheduled: Binding failed for port bd6c3933-bc0e-43ec-b6e3-dae79283705c, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 606.205552] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 606.205772] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquiring lock "refresh_cache-c731bc6a-9b0d-4e3a-b5ca-009d79896d27" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.205912] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Acquired lock "refresh_cache-c731bc6a-9b0d-4e3a-b5ca-009d79896d27" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.206079] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 606.210124] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492b4e78-0b8e-4948-a0ad-1c0955f894bf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.221752] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f22c6745-d33d-4fa9-adbb-31ed85c1f413 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.241665] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 606.244372] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dbeff58-57d9-4f84-91a6-1a5819fee6a9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.253053] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afdc906f-3070-4b40-9bd8-0e4a6c43b9c1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.285442] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181505MB free_disk=139GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 606.285594] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.285783] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.376652] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 606.463986] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance f1894e2a-156c-420c-91af-a4eedaafb017 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 606.496543] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance c731bc6a-9b0d-4e3a-b5ca-009d79896d27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 606.496543] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 09e936da-040a-438a-a320-28616de7bb75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 606.496543] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 3681b35b-c962-4e80-8f9c-df0db2f515e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 606.496543] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance dbb50c25-381f-4878-945b-170f2681f2ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 606.496911] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 4dbfaeea-229a-4ed1-afb2-bd8e167a1385 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 606.496911] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 93b26227-ad64-4343-aed9-ba6622aaf83e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 606.496911] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 606.496911] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 606.600641] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.618443] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Releasing lock "refresh_cache-09e936da-040a-438a-a320-28616de7bb75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.618443] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 606.618443] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 606.618443] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c7425bf0-e9b2-4346-a041-7f88436f2f0f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.631527] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf50899e-1c14-457f-b6e7-ced6e6e40ac9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.650785] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a71b07e-aef2-4143-b1c2-c2aad2ba0d0d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.660995] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-727a8e64-16e6-438c-bad1-9d8587faf296 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.670716] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 09e936da-040a-438a-a320-28616de7bb75 could not be found. [ 606.670960] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 606.671148] env[60164]: INFO nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Took 0.05 seconds to destroy the instance on the hypervisor. [ 606.671382] env[60164]: DEBUG oslo.service.loopingcall [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.673045] env[60164]: DEBUG nova.compute.manager [-] [instance: 09e936da-040a-438a-a320-28616de7bb75] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 606.673045] env[60164]: DEBUG nova.network.neutron [-] [instance: 09e936da-040a-438a-a320-28616de7bb75] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 606.700800] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dd3e082-66b9-45c5-8804-c2fe4ebd2946 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.706922] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd956770-a51f-4e78-8179-96f851671aa9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.721166] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 606.736229] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 606.767119] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 606.767119] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.480s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.949718] env[60164]: DEBUG nova.network.neutron [-] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 606.958658] env[60164]: DEBUG nova.network.neutron [-] [instance: 09e936da-040a-438a-a320-28616de7bb75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.971488] env[60164]: INFO nova.compute.manager [-] [instance: 09e936da-040a-438a-a320-28616de7bb75] Took 0.30 seconds to deallocate network for instance. 
[ 606.973855] env[60164]: DEBUG nova.compute.claims [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 606.974757] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.974757] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.054265] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.068963] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Releasing lock "refresh_cache-f1894e2a-156c-420c-91af-a4eedaafb017" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.069142] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 607.069304] env[60164]: DEBUG nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 607.069466] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 607.159833] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 607.174172] env[60164]: DEBUG nova.network.neutron [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.190750] env[60164]: INFO nova.compute.manager [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] [instance: f1894e2a-156c-420c-91af-a4eedaafb017] Took 0.12 seconds to deallocate network for instance. [ 607.202126] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74266d55-24d1-44a3-b048-bb207fe0134c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.214797] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4efbba1e-dde0-4d51-aaf0-10385201b755 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.231726] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.291764] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Releasing lock "refresh_cache-c731bc6a-9b0d-4e3a-b5ca-009d79896d27" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.292091] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 607.292242] env[60164]: DEBUG nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 607.292408] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 607.296754] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b1c8e6c-80b9-4c0c-a493-0a22518505bf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.310125] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4701aa69-4ed2-4f61-b133-653e52ebe342 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.329995] env[60164]: DEBUG nova.compute.provider_tree [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 607.340155] env[60164]: INFO nova.scheduler.client.report [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Deleted allocations for instance f1894e2a-156c-420c-91af-a4eedaafb017 [ 607.350210] env[60164]: DEBUG nova.scheduler.client.report [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 607.374629] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d2ce4208-2032-4f1b-8d6a-890bf3f1fa09 tempest-ServerDiagnosticsTest-844409055 tempest-ServerDiagnosticsTest-844409055-project-member] Lock "f1894e2a-156c-420c-91af-a4eedaafb017" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.645s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.403623] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 607.418327] env[60164]: DEBUG nova.network.neutron [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.420573] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.446s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.421200] env[60164]: ERROR nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] Traceback (most recent call last): [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self.driver.spawn(context, instance, image_meta, [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self._vmops.spawn(context, instance, image_meta, injected_files, [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] vm_ref = self.build_virtual_machine(instance, [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] vif_infos = vmwarevif.get_vif_info(self._session, [ 607.421200] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] for vif in network_info: [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return self._sync_wrapper(fn, *args, **kwargs) [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 
09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self.wait() [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self[:] = self._gt.wait() [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return self._exit_event.wait() [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] result = hub.switch() [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return self.greenlet.switch() [ 607.421510] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] result = function(*args, **kwargs) [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] return func(*args, **kwargs) [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] raise e [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] nwinfo = self.network_api.allocate_for_instance( [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] created_port_ids = self._update_ports_for_instance( [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] with excutils.save_and_reraise_exception(): [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 607.421858] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] self.force_reraise() [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] raise self.value [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] updated_port = self._update_port( [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] _ensure_no_port_binding_failure(port) [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] raise exception.PortBindingFailed(port_id=port['id']) [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] nova.exception.PortBindingFailed: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. [ 607.422222] env[60164]: ERROR nova.compute.manager [instance: 09e936da-040a-438a-a320-28616de7bb75] [ 607.422222] env[60164]: DEBUG nova.compute.utils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 607.427283] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Build of instance 09e936da-040a-438a-a320-28616de7bb75 was re-scheduled: Binding failed for port 4aa0c292-e1ce-4f56-9399-62866cbc7ba7, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 607.427870] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 607.428104] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquiring lock "refresh_cache-09e936da-040a-438a-a320-28616de7bb75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.428245] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Acquired lock "refresh_cache-09e936da-040a-438a-a320-28616de7bb75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.428396] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 607.430328] env[60164]: INFO nova.compute.manager [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] [instance: c731bc6a-9b0d-4e3a-b5ca-009d79896d27] Took 0.14 seconds to deallocate network for instance. [ 607.516331] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 607.555944] env[60164]: INFO nova.scheduler.client.report [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Deleted allocations for instance c731bc6a-9b0d-4e3a-b5ca-009d79896d27 [ 607.591092] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6263ba31-4ac7-4ed2-9e75-ff30010bd9d2 tempest-ServersAdminNegativeTestJSON-2079946617 tempest-ServersAdminNegativeTestJSON-2079946617-project-member] Lock "c731bc6a-9b0d-4e3a-b5ca-009d79896d27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.043s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.111043] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.122314] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Releasing lock "refresh_cache-09e936da-040a-438a-a320-28616de7bb75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.122314] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 608.122314] env[60164]: DEBUG nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 608.122314] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 608.218310] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 608.233095] env[60164]: DEBUG nova.network.neutron [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.250589] env[60164]: INFO nova.compute.manager [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] [instance: 09e936da-040a-438a-a320-28616de7bb75] Took 0.13 seconds to deallocate network for instance. [ 608.353569] env[60164]: DEBUG nova.compute.manager [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Received event network-changed-83c6510e-dc8c-4f57-b4d1-a3af8164ac57 {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 608.354239] env[60164]: DEBUG nova.compute.manager [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Refreshing instance network info cache due to event network-changed-83c6510e-dc8c-4f57-b4d1-a3af8164ac57. {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 608.354239] env[60164]: DEBUG oslo_concurrency.lockutils [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] Acquiring lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.354741] env[60164]: DEBUG oslo_concurrency.lockutils [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] Acquired lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 608.355050] env[60164]: DEBUG nova.network.neutron [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Refreshing network info cache for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57 {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 608.372055] env[60164]: INFO nova.scheduler.client.report [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Deleted allocations for instance 09e936da-040a-438a-a320-28616de7bb75 [ 608.396302] env[60164]: DEBUG oslo_concurrency.lockutils [None req-797add29-9b3a-43bc-9bf6-f89dc30b11cd tempest-DeleteServersAdminTestJSON-306082810 tempest-DeleteServersAdminTestJSON-306082810-project-member] Lock "09e936da-040a-438a-a320-28616de7bb75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.261s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.531117] env[60164]: DEBUG nova.network.neutron [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 608.550626] env[60164]: ERROR nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. [ 608.550626] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 608.550626] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 608.550626] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 608.550626] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 608.550626] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 608.550626] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 608.550626] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 608.550626] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 608.550626] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 608.550626] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 608.550626] env[60164]: ERROR nova.compute.manager raise self.value [ 608.550626] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 608.550626] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 608.550626] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 608.550626] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 608.551181] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 608.551181] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 608.551181] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. 
[ 608.551181] env[60164]: ERROR nova.compute.manager [ 608.551181] env[60164]: Traceback (most recent call last): [ 608.551181] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 608.551181] env[60164]: listener.cb(fileno) [ 608.551181] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 608.551181] env[60164]: result = function(*args, **kwargs) [ 608.551181] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 608.551181] env[60164]: return func(*args, **kwargs) [ 608.551181] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 608.551181] env[60164]: raise e [ 608.551181] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 608.551181] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 608.551181] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 608.551181] env[60164]: created_port_ids = self._update_ports_for_instance( [ 608.551181] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 608.551181] env[60164]: with excutils.save_and_reraise_exception(): [ 608.551181] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 608.551181] env[60164]: self.force_reraise() [ 608.551181] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 608.551181] env[60164]: raise self.value [ 608.551181] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 608.551181] env[60164]: updated_port = self._update_port( [ 608.551181] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 608.551181] env[60164]: _ensure_no_port_binding_failure(port) [ 608.551181] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 608.551181] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 608.552034] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. [ 608.552034] env[60164]: Removing descriptor: 17 [ 608.552034] env[60164]: ERROR nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. 
[ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Traceback (most recent call last): [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] yield resources [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self.driver.spawn(context, instance, image_meta, [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 608.552034] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] vm_ref = self.build_virtual_machine(instance, [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] vif_infos = vmwarevif.get_vif_info(self._session, [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] for vif in network_info: [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return self._sync_wrapper(fn, *args, **kwargs) [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self.wait() [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self[:] = self._gt.wait() [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return self._exit_event.wait() [ 608.552432] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 608.552432] env[60164]: ERROR nova.compute.manager 
[instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] result = hub.switch() [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return self.greenlet.switch() [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] result = function(*args, **kwargs) [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return func(*args, **kwargs) [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] raise e [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] nwinfo = self.network_api.allocate_for_instance( [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] created_port_ids = self._update_ports_for_instance( [ 608.552956] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] with excutils.save_and_reraise_exception(): [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self.force_reraise() [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] raise self.value [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] updated_port = self._update_port( [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 
3681b35b-c962-4e80-8f9c-df0db2f515e9] _ensure_no_port_binding_failure(port) [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] raise exception.PortBindingFailed(port_id=port['id']) [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. [ 608.553340] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] [ 608.553763] env[60164]: INFO nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Terminating instance [ 608.561812] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquiring lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.036278] env[60164]: DEBUG nova.network.neutron [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.046567] env[60164]: DEBUG oslo_concurrency.lockutils [req-cf6550cd-cf82-410e-83a1-4db67329afb6 req-fe365b63-d18d-4e24-8704-65fbe155725f service nova] Releasing lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.046957] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquired lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.047708] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 609.126174] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 609.243043] env[60164]: ERROR nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. [ 609.243043] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 609.243043] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 609.243043] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 609.243043] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 609.243043] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 609.243043] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 609.243043] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 609.243043] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 609.243043] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 609.243043] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 609.243043] env[60164]: ERROR nova.compute.manager raise self.value [ 609.243043] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 609.243043] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 609.243043] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 609.243043] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 609.243509] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 609.243509] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 609.243509] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. 
[ 609.243509] env[60164]: ERROR nova.compute.manager [ 609.243509] env[60164]: Traceback (most recent call last): [ 609.243509] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 609.243509] env[60164]: listener.cb(fileno) [ 609.243509] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 609.243509] env[60164]: result = function(*args, **kwargs) [ 609.243509] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 609.243509] env[60164]: return func(*args, **kwargs) [ 609.243509] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 609.243509] env[60164]: raise e [ 609.243509] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 609.243509] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 609.243509] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 609.243509] env[60164]: created_port_ids = self._update_ports_for_instance( [ 609.243509] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 609.243509] env[60164]: with excutils.save_and_reraise_exception(): [ 609.243509] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 609.243509] env[60164]: self.force_reraise() [ 609.243509] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 609.243509] env[60164]: raise self.value [ 609.243509] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 609.243509] env[60164]: updated_port = self._update_port( [ 609.243509] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 609.243509] env[60164]: _ensure_no_port_binding_failure(port) [ 609.243509] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 609.243509] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 609.244230] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. [ 609.244230] env[60164]: Removing descriptor: 18 [ 609.244230] env[60164]: ERROR nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. 
[ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Traceback (most recent call last): [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] yield resources [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self.driver.spawn(context, instance, image_meta, [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 609.244230] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] vm_ref = self.build_virtual_machine(instance, [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] vif_infos = vmwarevif.get_vif_info(self._session, [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] for vif in network_info: [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return self._sync_wrapper(fn, *args, **kwargs) [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self.wait() [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self[:] = self._gt.wait() [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return self._exit_event.wait() [ 609.244524] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 609.244524] env[60164]: ERROR nova.compute.manager 
[instance: dbb50c25-381f-4878-945b-170f2681f2ae] result = hub.switch() [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return self.greenlet.switch() [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] result = function(*args, **kwargs) [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return func(*args, **kwargs) [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] raise e [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] nwinfo = self.network_api.allocate_for_instance( [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] created_port_ids = self._update_ports_for_instance( [ 609.244911] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] with excutils.save_and_reraise_exception(): [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self.force_reraise() [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] raise self.value [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] updated_port = self._update_port( [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: 
dbb50c25-381f-4878-945b-170f2681f2ae] _ensure_no_port_binding_failure(port) [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] raise exception.PortBindingFailed(port_id=port['id']) [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. [ 609.245247] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] [ 609.246105] env[60164]: INFO nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Terminating instance [ 609.246208] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquiring lock "refresh_cache-dbb50c25-381f-4878-945b-170f2681f2ae" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.246286] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquired lock "refresh_cache-dbb50c25-381f-4878-945b-170f2681f2ae" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.246488] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 609.471374] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 609.686019] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.699129] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Releasing lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.699129] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 609.699129] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 609.699433] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6cf030ac-6245-4b35-9f93-e0411eac9620 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.714917] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ea3243-50ac-430c-acf2-dd1bbaaae4e9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.741294] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3681b35b-c962-4e80-8f9c-df0db2f515e9 could not be found. [ 609.741656] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 609.741918] env[60164]: INFO nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 609.742180] env[60164]: DEBUG oslo.service.loopingcall [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 609.742379] env[60164]: DEBUG nova.compute.manager [-] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 609.742465] env[60164]: DEBUG nova.network.neutron [-] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 609.906188] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.918783] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Releasing lock "refresh_cache-dbb50c25-381f-4878-945b-170f2681f2ae" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.919218] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 609.921742] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 609.922996] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c4a77acb-57e7-4520-9e80-38a30c468e2e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.932711] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ba2eb21-b2b1-4e5c-88bf-89658b25dd49 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.955736] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dbb50c25-381f-4878-945b-170f2681f2ae could not be found. [ 609.955736] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 609.955736] env[60164]: INFO nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 609.955736] env[60164]: DEBUG oslo.service.loopingcall [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 609.955736] env[60164]: DEBUG nova.compute.manager [-] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 609.956045] env[60164]: DEBUG nova.network.neutron [-] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 609.988515] env[60164]: DEBUG nova.network.neutron [-] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 609.995827] env[60164]: DEBUG nova.network.neutron [-] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.010031] env[60164]: INFO nova.compute.manager [-] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Took 0.27 seconds to deallocate network for instance. [ 610.012256] env[60164]: DEBUG nova.compute.claims [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 610.012417] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.012617] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.083124] env[60164]: DEBUG nova.network.neutron [-] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 610.091488] env[60164]: DEBUG nova.network.neutron [-] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.103714] env[60164]: INFO nova.compute.manager [-] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Took 0.15 seconds to deallocate network for instance. 
[ 610.106382] env[60164]: DEBUG nova.compute.claims [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 610.106572] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.146401] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bb9088b-cdc1-46d3-805f-ac276e4ab7b8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.155828] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02c4d05e-9f54-4e04-bb8f-5de46087a900 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.193378] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18c7d587-daba-403c-9d32-719addefad90 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.201344] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a888cd01-4d17-459a-8afe-38b53a1f7bfb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.217664] env[60164]: DEBUG nova.compute.provider_tree [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 610.225978] env[60164]: DEBUG nova.scheduler.client.report [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 610.244734] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.232s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.245374] env[60164]: ERROR nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 
3681b35b-c962-4e80-8f9c-df0db2f515e9] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Traceback (most recent call last): [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self.driver.spawn(context, instance, image_meta, [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] vm_ref = self.build_virtual_machine(instance, [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] vif_infos = vmwarevif.get_vif_info(self._session, [ 610.245374] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] for vif in network_info: [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return self._sync_wrapper(fn, *args, **kwargs) [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self.wait() [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self[:] = self._gt.wait() [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return self._exit_event.wait() [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] result = hub.switch() [ 610.245754] 
env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return self.greenlet.switch() [ 610.245754] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] result = function(*args, **kwargs) [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] return func(*args, **kwargs) [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] raise e [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] nwinfo = self.network_api.allocate_for_instance( [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] created_port_ids = self._update_ports_for_instance( [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] with excutils.save_and_reraise_exception(): [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 610.246330] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] self.force_reraise() [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] raise self.value [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] updated_port = self._update_port( [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] _ensure_no_port_binding_failure(port) [ 610.246668] 
env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] raise exception.PortBindingFailed(port_id=port['id']) [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] nova.exception.PortBindingFailed: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. [ 610.246668] env[60164]: ERROR nova.compute.manager [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] [ 610.246668] env[60164]: DEBUG nova.compute.utils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 610.247406] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.141s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.253596] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Build of instance 3681b35b-c962-4e80-8f9c-df0db2f515e9 was re-scheduled: Binding failed for port 83c6510e-dc8c-4f57-b4d1-a3af8164ac57, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 610.254063] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 610.254289] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquiring lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.254434] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Acquired lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.254582] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 610.368786] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef8be1e2-d5b0-4430-a42c-bef8af8c131c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.378140] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-982937d2-3d09-48c4-a6a6-e539b6ca641a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.415604] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da58737-7722-4127-b52d-0259ae998a7f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.419500] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 610.428272] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27892b34-17db-458a-a8a3-77d38f74d855 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.444832] env[60164]: DEBUG nova.compute.provider_tree [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 610.453557] env[60164]: DEBUG nova.scheduler.client.report [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 610.469964] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.223s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.471204] env[60164]: ERROR nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. 
[ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Traceback (most recent call last): [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self.driver.spawn(context, instance, image_meta, [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] vm_ref = self.build_virtual_machine(instance, [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] vif_infos = vmwarevif.get_vif_info(self._session, [ 610.471204] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] for vif in network_info: [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return self._sync_wrapper(fn, *args, **kwargs) [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self.wait() [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self[:] = self._gt.wait() [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return self._exit_event.wait() [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] result = hub.switch() [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 610.471753] env[60164]: ERROR 
nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return self.greenlet.switch() [ 610.471753] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] result = function(*args, **kwargs) [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] return func(*args, **kwargs) [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] raise e [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] nwinfo = self.network_api.allocate_for_instance( [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] created_port_ids = self._update_ports_for_instance( [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] with excutils.save_and_reraise_exception(): [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 610.472152] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] self.force_reraise() [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] raise self.value [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] updated_port = self._update_port( [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] _ensure_no_port_binding_failure(port) [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 610.472459] env[60164]: ERROR 
nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] raise exception.PortBindingFailed(port_id=port['id']) [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] nova.exception.PortBindingFailed: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. [ 610.472459] env[60164]: ERROR nova.compute.manager [instance: dbb50c25-381f-4878-945b-170f2681f2ae] [ 610.472459] env[60164]: DEBUG nova.compute.utils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 610.474109] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Build of instance dbb50c25-381f-4878-945b-170f2681f2ae was re-scheduled: Binding failed for port 8d00bd2a-2402-4f54-9624-959f29f21eda, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 610.474499] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 610.474920] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquiring lock "refresh_cache-dbb50c25-381f-4878-945b-170f2681f2ae" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.475261] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Acquired lock "refresh_cache-dbb50c25-381f-4878-945b-170f2681f2ae" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.475382] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 610.578987] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 610.722365] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.733678] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Releasing lock "refresh_cache-3681b35b-c962-4e80-8f9c-df0db2f515e9" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.733883] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 610.734012] env[60164]: DEBUG nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 610.734249] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 610.828801] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 610.839196] env[60164]: DEBUG nova.network.neutron [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.856545] env[60164]: INFO nova.compute.manager [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] [instance: 3681b35b-c962-4e80-8f9c-df0db2f515e9] Took 0.12 seconds to deallocate network for instance. [ 610.923431] env[60164]: ERROR nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. 
[ 610.923431] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 610.923431] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 610.923431] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 610.923431] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 610.923431] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 610.923431] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 610.923431] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 610.923431] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 610.923431] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 610.923431] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 610.923431] env[60164]: ERROR nova.compute.manager raise self.value [ 610.923431] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 610.923431] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 610.923431] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 610.923431] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 610.923886] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 610.923886] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 610.923886] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. 
[ 610.923886] env[60164]: ERROR nova.compute.manager [ 610.923886] env[60164]: Traceback (most recent call last): [ 610.923886] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 610.923886] env[60164]: listener.cb(fileno) [ 610.923886] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 610.923886] env[60164]: result = function(*args, **kwargs) [ 610.923886] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 610.923886] env[60164]: return func(*args, **kwargs) [ 610.923886] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 610.923886] env[60164]: raise e [ 610.923886] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 610.923886] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 610.923886] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 610.923886] env[60164]: created_port_ids = self._update_ports_for_instance( [ 610.923886] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 610.923886] env[60164]: with excutils.save_and_reraise_exception(): [ 610.923886] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 610.923886] env[60164]: self.force_reraise() [ 610.923886] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 610.923886] env[60164]: raise self.value [ 610.923886] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 610.923886] env[60164]: updated_port = self._update_port( [ 610.923886] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 610.923886] env[60164]: _ensure_no_port_binding_failure(port) [ 610.923886] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 610.923886] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 610.924656] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. [ 610.924656] env[60164]: Removing descriptor: 19 [ 610.924656] env[60164]: ERROR nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. 
[ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Traceback (most recent call last): [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] yield resources [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self.driver.spawn(context, instance, image_meta, [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self._vmops.spawn(context, instance, image_meta, injected_files, [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 610.924656] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] vm_ref = self.build_virtual_machine(instance, [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] vif_infos = vmwarevif.get_vif_info(self._session, [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] for vif in network_info: [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return self._sync_wrapper(fn, *args, **kwargs) [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self.wait() [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self[:] = self._gt.wait() [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return self._exit_event.wait() [ 610.925144] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 610.925144] env[60164]: ERROR nova.compute.manager 
[instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] result = hub.switch() [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return self.greenlet.switch() [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] result = function(*args, **kwargs) [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return func(*args, **kwargs) [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] raise e [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] nwinfo = self.network_api.allocate_for_instance( [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] created_port_ids = self._update_ports_for_instance( [ 610.925520] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] with excutils.save_and_reraise_exception(): [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self.force_reraise() [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] raise self.value [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] updated_port = self._update_port( [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 
4dbfaeea-229a-4ed1-afb2-bd8e167a1385] _ensure_no_port_binding_failure(port) [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] raise exception.PortBindingFailed(port_id=port['id']) [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. [ 610.925865] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] [ 610.927764] env[60164]: INFO nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Terminating instance [ 610.931296] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquiring lock "refresh_cache-4dbfaeea-229a-4ed1-afb2-bd8e167a1385" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.931296] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquired lock "refresh_cache-4dbfaeea-229a-4ed1-afb2-bd8e167a1385" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.931296] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 610.944746] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.959031] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Releasing lock "refresh_cache-dbb50c25-381f-4878-945b-170f2681f2ae" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.959208] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 610.961798] env[60164]: DEBUG nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 610.961798] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 610.988019] env[60164]: INFO nova.scheduler.client.report [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Deleted allocations for instance 3681b35b-c962-4e80-8f9c-df0db2f515e9 [ 611.006816] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 611.018191] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f5c1b416-d9aa-42f4-ad1c-7382d98aa4e0 tempest-TenantUsagesTestJSON-2036990172 tempest-TenantUsagesTestJSON-2036990172-project-member] Lock "3681b35b-c962-4e80-8f9c-df0db2f515e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.846s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.028773] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 611.039182] env[60164]: DEBUG nova.network.neutron [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 611.050378] env[60164]: INFO nova.compute.manager [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] [instance: dbb50c25-381f-4878-945b-170f2681f2ae] Took 0.09 seconds to deallocate network for instance. 
[ 611.163792] env[60164]: INFO nova.scheduler.client.report [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Deleted allocations for instance dbb50c25-381f-4878-945b-170f2681f2ae [ 611.197210] env[60164]: DEBUG oslo_concurrency.lockutils [None req-733d259c-18d3-48c3-94b3-b8409db5e48a tempest-MigrationsAdminTest-1656569401 tempest-MigrationsAdminTest-1656569401-project-member] Lock "dbb50c25-381f-4878-945b-170f2681f2ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.858120] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 611.867376] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Releasing lock "refresh_cache-4dbfaeea-229a-4ed1-afb2-bd8e167a1385" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.872104] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 611.872104] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 611.872104] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8d55a410-28eb-450b-9742-bc5208457abc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.888692] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9513e077-dec8-4d26-b2c4-34a25b787696 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.915949] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4dbfaeea-229a-4ed1-afb2-bd8e167a1385 could not be found. 
[ 611.916203] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 611.916383] env[60164]: INFO nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Took 0.05 seconds to destroy the instance on the hypervisor. [ 611.916893] env[60164]: DEBUG oslo.service.loopingcall [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 611.917169] env[60164]: DEBUG nova.compute.manager [-] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 611.917258] env[60164]: DEBUG nova.network.neutron [-] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 612.018415] env[60164]: DEBUG nova.network.neutron [-] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 612.026064] env[60164]: DEBUG nova.network.neutron [-] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.040040] env[60164]: INFO nova.compute.manager [-] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Took 0.12 seconds to deallocate network for instance. 
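Editor's note, not part of the captured log: every failed build above bottoms out in _ensure_no_port_binding_failure (nova/network/neutron.py:294 in this run) raising nova.exception.PortBindingFailed, after which the compute manager aborts the resource claim and re-schedules the instance. The lines below are a minimal, hypothetical Python sketch of that kind of check, assuming Neutron flags an unbindable port by reporting binding:vif_type = 'binding_failed'; apart from the exception name and the port id quoted from the log, the identifiers here are illustrative and are not Nova's actual code.

# Editor's illustrative sketch -- NOT part of this log and NOT Nova source.
# Assumption: Neutron marks a port whose binding failed by reporting
# binding:vif_type = "binding_failed"; Nova turns that flag into the
# PortBindingFailed error seen repeatedly above, and the build is re-scheduled.

class PortBindingFailed(Exception):
    """Mimics the message format of nova.exception.PortBindingFailed."""
    def __init__(self, port_id):
        super().__init__(
            "Binding failed for port %s, please check neutron logs "
            "for more information." % port_id)

def ensure_no_port_binding_failure(port):
    # Hypothetical stand-in for the check at nova/network/neutron.py:294.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# A port that Neutron created but could not bind (id taken from the log above).
failed_port = {
    'id': '83c6510e-dc8c-4f57-b4d1-a3af8164ac57',
    'binding:vif_type': 'binding_failed',
}

try:
    ensure_no_port_binding_failure(failed_port)
except PortBindingFailed as exc:
    print(exc)  # prints the same "Binding failed for port ..." text seen in the log

End of editor's note; the captured log continues below.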
[ 612.043591] env[60164]: DEBUG nova.compute.claims [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 612.043591] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.043591] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.153431] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb6e1840-8b46-4ebc-814d-e49e38878532 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.164344] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0873c939-357b-4c8f-b695-ea676fd48257 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.205104] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1586f91-5fba-4e8b-a0e3-3811aaf95e6a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.215522] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bed3e309-0eb4-4d5b-a12c-b4541e30bd86 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.235389] env[60164]: DEBUG nova.compute.provider_tree [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 612.254182] env[60164]: DEBUG nova.scheduler.client.report [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 612.279944] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.234s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.279944] env[60164]: ERROR nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Traceback (most recent call last): [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self.driver.spawn(context, instance, image_meta, [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self._vmops.spawn(context, instance, image_meta, injected_files, [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 612.279944] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] vm_ref = self.build_virtual_machine(instance, [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] vif_infos = vmwarevif.get_vif_info(self._session, [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] for vif in network_info: [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return self._sync_wrapper(fn, *args, **kwargs) [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self.wait() [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self[:] = self._gt.wait() [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 
4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return self._exit_event.wait() [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 612.280324] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] result = hub.switch() [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return self.greenlet.switch() [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] result = function(*args, **kwargs) [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] return func(*args, **kwargs) [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] raise e [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] nwinfo = self.network_api.allocate_for_instance( [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] created_port_ids = self._update_ports_for_instance( [ 612.280798] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] with excutils.save_and_reraise_exception(): [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] self.force_reraise() [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] raise self.value [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 
4dbfaeea-229a-4ed1-afb2-bd8e167a1385] updated_port = self._update_port( [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] _ensure_no_port_binding_failure(port) [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] raise exception.PortBindingFailed(port_id=port['id']) [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] nova.exception.PortBindingFailed: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. [ 612.281160] env[60164]: ERROR nova.compute.manager [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] [ 612.281531] env[60164]: DEBUG nova.compute.utils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 612.281531] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Build of instance 4dbfaeea-229a-4ed1-afb2-bd8e167a1385 was re-scheduled: Binding failed for port 25193a61-bed4-4d25-8d37-d6270569cf1a, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 612.281531] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 612.281663] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquiring lock "refresh_cache-4dbfaeea-229a-4ed1-afb2-bd8e167a1385" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.281819] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Acquired lock "refresh_cache-4dbfaeea-229a-4ed1-afb2-bd8e167a1385" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.281917] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 612.375336] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 613.000263] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.013497] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Releasing lock "refresh_cache-4dbfaeea-229a-4ed1-afb2-bd8e167a1385" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.014056] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 613.014325] env[60164]: DEBUG nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 613.014691] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 613.081534] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 613.095830] env[60164]: DEBUG nova.network.neutron [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.109435] env[60164]: INFO nova.compute.manager [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] [instance: 4dbfaeea-229a-4ed1-afb2-bd8e167a1385] Took 0.09 seconds to deallocate network for instance. [ 613.226540] env[60164]: INFO nova.scheduler.client.report [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Deleted allocations for instance 4dbfaeea-229a-4ed1-afb2-bd8e167a1385 [ 613.258667] env[60164]: DEBUG oslo_concurrency.lockutils [None req-97f64e23-a856-4e36-b538-6171155631ac tempest-ImagesTestJSON-1759893354 tempest-ImagesTestJSON-1759893354-project-member] Lock "4dbfaeea-229a-4ed1-afb2-bd8e167a1385" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.037s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.462194] env[60164]: ERROR nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. 
[ 613.462194] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 613.462194] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 613.462194] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 613.462194] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 613.462194] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 613.462194] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 613.462194] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 613.462194] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 613.462194] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 613.462194] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 613.462194] env[60164]: ERROR nova.compute.manager raise self.value [ 613.462194] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 613.462194] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 613.462194] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 613.462194] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 613.462609] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 613.462609] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 613.462609] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. 
[ 613.462609] env[60164]: ERROR nova.compute.manager [ 613.462609] env[60164]: Traceback (most recent call last): [ 613.462609] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 613.462609] env[60164]: listener.cb(fileno) [ 613.462609] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 613.462609] env[60164]: result = function(*args, **kwargs) [ 613.462609] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 613.462609] env[60164]: return func(*args, **kwargs) [ 613.462609] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 613.462609] env[60164]: raise e [ 613.462609] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 613.462609] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 613.462609] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 613.462609] env[60164]: created_port_ids = self._update_ports_for_instance( [ 613.462609] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 613.462609] env[60164]: with excutils.save_and_reraise_exception(): [ 613.462609] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 613.462609] env[60164]: self.force_reraise() [ 613.462609] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 613.462609] env[60164]: raise self.value [ 613.462609] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 613.462609] env[60164]: updated_port = self._update_port( [ 613.462609] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 613.462609] env[60164]: _ensure_no_port_binding_failure(port) [ 613.462609] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 613.462609] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 613.463322] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. [ 613.463322] env[60164]: Removing descriptor: 20 [ 613.463322] env[60164]: ERROR nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. 
[ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Traceback (most recent call last): [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] yield resources [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self.driver.spawn(context, instance, image_meta, [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 613.463322] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] vm_ref = self.build_virtual_machine(instance, [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] vif_infos = vmwarevif.get_vif_info(self._session, [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] for vif in network_info: [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return self._sync_wrapper(fn, *args, **kwargs) [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self.wait() [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self[:] = self._gt.wait() [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return self._exit_event.wait() [ 613.463796] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 613.463796] env[60164]: ERROR nova.compute.manager 
[instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] result = hub.switch() [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return self.greenlet.switch() [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] result = function(*args, **kwargs) [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return func(*args, **kwargs) [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] raise e [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] nwinfo = self.network_api.allocate_for_instance( [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] created_port_ids = self._update_ports_for_instance( [ 613.465372] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] with excutils.save_and_reraise_exception(): [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self.force_reraise() [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] raise self.value [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] updated_port = self._update_port( [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 
93b26227-ad64-4343-aed9-ba6622aaf83e] _ensure_no_port_binding_failure(port) [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] raise exception.PortBindingFailed(port_id=port['id']) [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. [ 613.465700] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] [ 613.466289] env[60164]: INFO nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Terminating instance [ 613.467682] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquiring lock "refresh_cache-93b26227-ad64-4343-aed9-ba6622aaf83e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.467910] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquired lock "refresh_cache-93b26227-ad64-4343-aed9-ba6622aaf83e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.468167] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 613.545837] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 614.034825] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.052529] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Releasing lock "refresh_cache-93b26227-ad64-4343-aed9-ba6622aaf83e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.053011] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 614.053214] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 614.053753] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fe89e357-35ca-430f-9bf3-54b8a8837651 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.066902] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f18ce7-d4b2-4c81-b7ac-bf1f301f1be0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.094199] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93b26227-ad64-4343-aed9-ba6622aaf83e could not be found. [ 614.094433] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 614.094609] env[60164]: INFO nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Took 0.04 seconds to destroy the instance on the hypervisor. 
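Both PortBindingFailed tracebacks above bottom out in _ensure_no_port_binding_failure() (nova/network/neutron.py:294), which turns a Neutron port whose binding failed into an exception so the build can be aborted and re-scheduled. The following is only a minimal, self-contained sketch of that check, not Nova's actual module: the constant, the exception class, and the function/port names here are illustrative stand-ins, assuming a plain dict port and that Neutron marks a failed binding with binding:vif_type == 'binding_failed'.

    # Illustrative stand-ins; Nova uses nova.exception.PortBindingFailed and
    # nova.network.model.VIF_TYPE_BINDING_FAILED instead of these.
    VIF_TYPE_BINDING_FAILED = 'binding_failed'

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")

    def ensure_no_port_binding_failure(port):
        # Neutron reports a port its mechanism drivers could not bind with
        # binding:vif_type == 'binding_failed'; converting that into an
        # exception is what triggers the re-schedule seen in the log above.
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])

    # Example with a port id taken from the log above.
    try:
        ensure_no_port_binding_failure(
            {'id': '26abace3-118f-40c6-882e-a81b73e76917',
             'binding:vif_type': VIF_TYPE_BINDING_FAILED})
    except PortBindingFailed as exc:
        print(exc)  # same message the compute manager logs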
[ 614.094850] env[60164]: DEBUG oslo.service.loopingcall [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 614.095076] env[60164]: DEBUG nova.compute.manager [-] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 614.095288] env[60164]: DEBUG nova.network.neutron [-] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 614.323973] env[60164]: DEBUG nova.network.neutron [-] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 614.339552] env[60164]: DEBUG nova.network.neutron [-] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.354296] env[60164]: INFO nova.compute.manager [-] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Took 0.26 seconds to deallocate network for instance. [ 614.358532] env[60164]: DEBUG nova.compute.claims [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 614.360575] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.360575] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.481214] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01350dc9-dd63-4c4a-96c1-208d933d2843 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.493750] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01408c6d-d5e2-48f4-aeca-9a7c975e3856 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.532348] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e52bf407-b943-4a67-be4c-928e6b7f8beb {{(pid=60164) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.539343] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49276680-4005-487a-ba42-a2355daa3ca6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.562033] env[60164]: DEBUG nova.compute.provider_tree [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 614.572666] env[60164]: DEBUG nova.scheduler.client.report [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 614.586180] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.586837] env[60164]: ERROR nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. 
[ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Traceback (most recent call last): [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self.driver.spawn(context, instance, image_meta, [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] vm_ref = self.build_virtual_machine(instance, [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] vif_infos = vmwarevif.get_vif_info(self._session, [ 614.586837] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] for vif in network_info: [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return self._sync_wrapper(fn, *args, **kwargs) [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self.wait() [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self[:] = self._gt.wait() [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return self._exit_event.wait() [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] result = hub.switch() [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 614.587734] env[60164]: ERROR 
nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return self.greenlet.switch() [ 614.587734] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] result = function(*args, **kwargs) [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] return func(*args, **kwargs) [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] raise e [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] nwinfo = self.network_api.allocate_for_instance( [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] created_port_ids = self._update_ports_for_instance( [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] with excutils.save_and_reraise_exception(): [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 614.589973] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] self.force_reraise() [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] raise self.value [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] updated_port = self._update_port( [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] _ensure_no_port_binding_failure(port) [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 614.590341] env[60164]: ERROR 
nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] raise exception.PortBindingFailed(port_id=port['id']) [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] nova.exception.PortBindingFailed: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. [ 614.590341] env[60164]: ERROR nova.compute.manager [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] [ 614.590591] env[60164]: DEBUG nova.compute.utils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 614.590591] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Build of instance 93b26227-ad64-4343-aed9-ba6622aaf83e was re-scheduled: Binding failed for port 26abace3-118f-40c6-882e-a81b73e76917, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 614.590591] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 614.590591] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquiring lock "refresh_cache-93b26227-ad64-4343-aed9-ba6622aaf83e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 614.590761] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Acquired lock "refresh_cache-93b26227-ad64-4343-aed9-ba6622aaf83e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 614.590793] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 614.652219] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 615.198329] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.212296] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Releasing lock "refresh_cache-93b26227-ad64-4343-aed9-ba6622aaf83e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 615.212519] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 615.212690] env[60164]: DEBUG nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 615.212848] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 615.261527] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 615.274941] env[60164]: DEBUG nova.network.neutron [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.295837] env[60164]: INFO nova.compute.manager [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] [instance: 93b26227-ad64-4343-aed9-ba6622aaf83e] Took 0.08 seconds to deallocate network for instance. 
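Each traceback above also passes through oslo_utils.excutils.save_and_reraise_exception(): the port update in _update_ports_for_instance is wrapped in that context manager so cleanup can run and be logged before the original PortBindingFailed is re-raised via force_reraise()/raise self.value. Below is a rough, stdlib-only approximation of that pattern, not the oslo.utils implementation; save_and_reraise_exception, update_ports_for_instance, and failing_update here are illustrative names, and the real helper also supports suppressing the re-raise and richer logging.

    import contextlib
    import logging
    import sys

    LOG = logging.getLogger(__name__)

    @contextlib.contextmanager
    def save_and_reraise_exception():
        # Rough stand-in: capture the exception currently propagating, let the
        # caller run cleanup/logging, then re-raise the original exception so
        # it keeps bubbling up with its traceback, as in the log above.
        exc_value = sys.exc_info()[1]
        try:
            yield
        finally:
            if exc_value is not None:
                raise exc_value

    def update_ports_for_instance(update_port):
        # Mirrors the shape seen in the traceback: the failing port update is
        # wrapped so partially-created ports could be cleaned up before the
        # original error is re-raised to the compute manager.
        try:
            update_port()
        except Exception:
            with save_and_reraise_exception():
                LOG.warning("port update failed, cleaning up before re-raising")

    def failing_update():
        raise RuntimeError("simulated port binding failure")

    try:
        update_ports_for_instance(failing_update)
    except RuntimeError as exc:
        print(exc)  # the original error survives the cleanup block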
[ 615.431021] env[60164]: INFO nova.scheduler.client.report [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Deleted allocations for instance 93b26227-ad64-4343-aed9-ba6622aaf83e [ 615.458156] env[60164]: DEBUG oslo_concurrency.lockutils [None req-b2a35850-14c6-4f74-8a9b-fc0caf713225 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501 tempest-FloatingIPsAssociationNegativeTestJSON-1043172501-project-member] Lock "93b26227-ad64-4343-aed9-ba6622aaf83e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.635s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.752876] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.753135] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.776274] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.776517] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.779383] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.779539] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60164) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10408}} [ 666.892169] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.892386] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 667.887689] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 667.887946] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Starting heal instance info cache {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9789}} [ 667.887946] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Rebuilding the list of instances to heal {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9793}} [ 667.900676] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Didn't find any instances for network info cache update. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9875}} [ 667.900676] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 667.900676] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 667.910996] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.911255] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.911418] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.911573] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 667.912664] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63199108-f260-47ca-a767-94604e6f20d2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.923488] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-282bb9a1-f92c-42f2-a7f7-22559da331ba {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.942930] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33eb93d2-9718-469f-8556-e5bf11da8fef {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.951310] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-855abb6e-86cc-47fe-8ade-bd7b4685a67f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.987232] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181396MB free_disk=138GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 667.987499] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.987768] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.049801] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 668.050399] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 668.073424] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfb13788-e7d8-42bc-a7e5-512dc7db981d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.084095] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3489e31f-9e88-4c98-9706-07f2e95c05e4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.119786] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4cf4998-794b-4972-9ee5-7bd61309326a {{(pid=60164) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.128684] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43ec4903-c962-499d-8fd0-bceba32ecdf8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.143641] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 668.176560] env[60164]: ERROR nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [req-aab53e08-2ba1-4a8f-bd9b-5f19487f7f8b] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f. Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-aab53e08-2ba1-4a8f-bd9b-5f19487f7f8b"}]} [ 668.199588] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Refreshing inventories for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 668.225062] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating ProviderTree inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 668.225247] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 668.240071] env[60164]: DEBUG 
nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Refreshing aggregate associations for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f, aggregates: e7c3f152-76c7-48ff-9fd5-b0ec1a295503 {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 668.260772] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Refreshing trait associations for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 668.307329] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0a79b5f-c027-4582-ab49-f7f2e778f382 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.315990] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bd67156-f2a7-4ebd-9b7a-5f38801f014d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.349540] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b30f6b1-0c62-4a21-93e5-e55424b43640 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.357304] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f80014e7-5ae5-4f33-b81e-304ad4dd957c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.373436] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 668.410505] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updated inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with generation 12 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 668.410768] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f generation from 12 to 13 during operation: update_inventory {{(pid=60164) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 668.410936] env[60164]: DEBUG nova.compute.provider_tree [None 
req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 668.427267] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 668.427604] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.440s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 677.630930] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "e46ad9d2-d215-4205-b0c1-44726b08cb45" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.631307] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "e46ad9d2-d215-4205-b0c1-44726b08cb45" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.650444] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 677.726140] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.726140] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.729694] env[60164]: INFO nova.compute.claims [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 677.835140] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1606fd5a-78b8-4e20-a6af-a74e32f6914e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.844157] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-582dbdb0-7c2b-4916-b350-9563ac5d9ef7 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.900265] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f44000a5-2143-4cd0-af6d-c68a8b85bc09 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.908169] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a12cfc0-a3d8-4a5f-b55d-852e16b2d0c8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.922233] env[60164]: DEBUG nova.compute.provider_tree [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 677.962077] env[60164]: ERROR nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [req-3a611bfd-b589-47ee-97c2-4c96749d8b6a] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f. Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-3a611bfd-b589-47ee-97c2-4c96749d8b6a"}]} [ 677.980172] env[60164]: DEBUG nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Refreshing inventories for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 678.009010] env[60164]: DEBUG nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updating ProviderTree inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 678.009010] env[60164]: DEBUG nova.compute.provider_tree [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 678.015025] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "7466dfd3-8756-40eb-91fd-c87f16b627ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.015025] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "7466dfd3-8756-40eb-91fd-c87f16b627ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.026289] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 678.033032] env[60164]: DEBUG nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Refreshing aggregate associations for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f, aggregates: None {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 678.068137] env[60164]: DEBUG nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Refreshing trait associations for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 678.093086] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.135868] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74dfbd95-4a56-4da8-8616-4a42ba055a8d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.148594] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cd2f7d7-7b00-4f3d-a91a-d082ddddd47c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.196941] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cde1441d-ae7d-4c20-957f-cc49b9f9ecb8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.209756] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4a7782f-080f-470e-a129-2cb5886f1257 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.222559] env[60164]: DEBUG nova.compute.provider_tree [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 678.270359] env[60164]: DEBUG nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updated inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with generation 16 in Placement from set_inventory_for_provider using data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 678.270359] env[60164]: DEBUG nova.compute.provider_tree [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updating resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f generation from 16 to 17 during operation: update_inventory {{(pid=60164) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 678.270607] env[60164]: DEBUG nova.compute.provider_tree [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 678.291668] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.566s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 678.292073] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 678.297936] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.205s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.299779] env[60164]: INFO nova.compute.claims [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 678.348293] env[60164]: DEBUG nova.compute.utils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 678.348773] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 678.349075] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 678.366138] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 678.437652] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d688f2e-a007-464d-94c7-896e26a9d1fc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.446037] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e797eb50-3bc1-4d5b-b5d0-8952bf1e69f5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.481018] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 678.483570] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a77422cb-503c-4fc7-a398-7977c431e2cc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.491970] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-768c50cc-9db6-43a8-bf86-958d4e13d506 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.505972] env[60164]: DEBUG nova.compute.provider_tree [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 678.517460] env[60164]: DEBUG nova.scheduler.client.report [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 678.524039] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 678.524039] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 678.524039] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 678.524319] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 
tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 678.524319] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 678.524319] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 678.524319] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 678.524319] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 678.524443] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 678.524443] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 678.524443] env[60164]: DEBUG nova.virt.hardware [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 678.525127] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9935210a-b05d-42de-a3aa-54b9f0eece8b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.534484] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 678.534956] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 
7466dfd3-8756-40eb-91fd-c87f16b627ef] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 678.538192] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbb1d5f4-9c2f-4fb2-b22c-a3bf9810f2a6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.571143] env[60164]: DEBUG nova.compute.utils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 678.572429] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Not allocating networking since 'none' was specified. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 678.586147] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 678.661968] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 678.686123] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 678.686123] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 678.686123] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 678.686267] env[60164]: DEBUG 
nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 678.686267] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 678.686267] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 678.686752] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 678.687146] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 678.687466] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 678.687724] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 678.687987] env[60164]: DEBUG nova.virt.hardware [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 678.688916] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31df4f62-74d3-4c09-843e-4e8999b1bd39 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.701038] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ec57ae2-2dbe-4c82-aab8-5cafba545a8c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.718021] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Instance VIF info [] {{(pid=60164) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 678.731282] env[60164]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 678.732353] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9aed729a-7d0b-4969-8749-97f636a9878c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.808989] env[60164]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 678.808989] env[60164]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=60164) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 678.808989] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Folder already exists: OpenStack. Parent ref: group-v4. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 678.808989] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating folder: Project (af6d8fe22e704a3aa476242b6f6ef896). Parent ref: group-v277790. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 678.808989] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b23f1525-aff8-4104-88ca-af2cb689742d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.819263] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Created folder: Project (af6d8fe22e704a3aa476242b6f6ef896) in parent group-v277790. [ 678.819263] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating folder: Instances. Parent ref: group-v277795. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 678.819263] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b3395449-91d8-4802-8615-a9a88a809e9b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.828183] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Created folder: Instances in parent group-v277795. [ 678.828394] env[60164]: DEBUG oslo.service.loopingcall [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 678.828617] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Creating VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 678.828823] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b93524c9-d85c-416f-946a-1a7ef65c3fda {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.847494] env[60164]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 678.847494] env[60164]: value = "task-1295439" [ 678.847494] env[60164]: _type = "Task" [ 678.847494] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 678.856702] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295439, 'name': CreateVM_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 678.988995] env[60164]: DEBUG nova.policy [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f7221f06a45d45f2a34ab3bdd869113d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d75debce2fd4b2492cc02aeb2fed7fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 679.359171] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295439, 'name': CreateVM_Task, 'duration_secs': 0.24404} completed successfully. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 679.359370] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Created VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 679.360648] env[60164]: DEBUG oslo_vmware.service [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8257c14f-a39c-43e9-ac96-98dfcc1605fd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.366789] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 679.366956] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 679.368011] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 679.368254] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f915cb1-16f5-4348-a010-b57acb63abfb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.373738] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 679.373738] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]521f582e-f7b5-0002-e27d-a0fae925fe13" [ 679.373738] env[60164]: _type = "Task" [ 679.373738] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 679.383464] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]521f582e-f7b5-0002-e27d-a0fae925fe13, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 679.885356] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 679.885613] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Processing image 1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 679.885844] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 679.885985] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 679.886416] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 679.886663] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f51b72d5-6a32-4494-92cc-62f403e87ffe {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.904429] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 679.904616] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60164) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 679.905421] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c46eb791-85cf-4626-a66f-776dd339de93 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.913080] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f4f0a97-cce8-40d9-8f0b-241eb1021cfb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.918614] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 679.918614] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]524c7e5b-534c-0f00-f918-1703d6fdfd86" [ 679.918614] env[60164]: _type = "Task" [ 679.918614] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 679.932024] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]524c7e5b-534c-0f00-f918-1703d6fdfd86, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 680.408928] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "68545276-63f2-4baf-8110-d3cc71686682" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.409228] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Lock "68545276-63f2-4baf-8110-d3cc71686682" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.424034] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 680.435547] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Preparing fetch location {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 680.435808] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating directory with path [datastore1] vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 680.436106] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-522a49a2-3a8f-4679-987c-57a9aee6ea6a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.502920] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.502920] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.505540] env[60164]: INFO nova.compute.claims [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 680.553373] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.553586] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.572695] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 680.658908] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.672249] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae22b3ff-9210-4e40-9133-07ff1b3f7fea {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.681406] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ebbb9f-0ed8-4259-96c1-3997db28dd93 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.712231] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f2a248b-17f3-4e1e-ace0-8f529a25c102 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.720254] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94d38a63-c06d-4d8b-9a37-a116a15a344c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.735415] env[60164]: DEBUG nova.compute.provider_tree [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 680.747981] env[60164]: DEBUG nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 680.775018] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 680.775018] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 680.775430] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.117s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.777201] env[60164]: INFO nova.compute.claims [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 680.807403] env[60164]: DEBUG nova.compute.utils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 680.813883] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Not allocating networking since 'none' was specified. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 680.821362] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 680.902708] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 680.921032] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52fb4f01-c3a8-45a8-b53e-9989d3ec018b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.934182] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-594bbc0a-c6f1-4a65-b055-4fa354758dbb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.942028] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 680.942272] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 680.942464] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 680.942648] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 680.942787] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 680.942927] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 680.943222] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 680.943391] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 680.943559] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 680.943716] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 680.943886] env[60164]: DEBUG nova.virt.hardware [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 680.944949] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b96e19ef-27ed-4e2c-bd98-1a0203ec79db {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.983665] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65f97bf4-fe89-4f98-bd87-196321e15b70 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.994383] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37e8d679-161a-41ff-adce-346891eea979 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.000368] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57ab7bd2-0d39-470e-9c46-801d3ca04269 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.025617] env[60164]: DEBUG nova.compute.provider_tree [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 681.027074] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Instance VIF info [] {{(pid=60164) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 681.032951] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 
tempest-ServerDiagnosticsV248Test-1137552807-project-member] Creating folder: Project (39d93795676543a287744d7fde7dde5c). Parent ref: group-v277790. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 681.033740] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f4356c6d-b41d-4d8d-9939-f4cbaf925aaf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.037030] env[60164]: DEBUG nova.scheduler.client.report [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 681.044132] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Created folder: Project (39d93795676543a287744d7fde7dde5c) in parent group-v277790. [ 681.044345] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Creating folder: Instances. Parent ref: group-v277798. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 681.044565] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-83eb3284-d52c-4235-b654-80b5273ab87e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.053488] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Created folder: Instances in parent group-v277798. [ 681.053724] env[60164]: DEBUG oslo.service.loopingcall [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 681.054420] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Creating VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 681.055125] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.055580] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 681.058330] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-71f1bf4e-4010-4629-ab4e-c8948da18b98 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.076837] env[60164]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 681.076837] env[60164]: value = "task-1295442" [ 681.076837] env[60164]: _type = "Task" [ 681.076837] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.083668] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295442, 'name': CreateVM_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.099092] env[60164]: DEBUG nova.compute.utils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 681.100547] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Not allocating networking since 'none' was specified. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 681.113765] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 681.189525] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 681.226220] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 681.226469] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 681.226623] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 681.226799] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 681.226941] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 681.227094] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 681.227349] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 681.227538] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 681.227705] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 
tempest-ServerShowV247Test-916977799-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 681.227864] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 681.228047] env[60164]: DEBUG nova.virt.hardware [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 681.228894] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf33bd93-0c0c-4dc8-997a-57037a3e8db7 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.237796] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883a7bfc-bbce-49bf-ad61-d1159b01b69f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.253042] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Instance VIF info [] {{(pid=60164) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 681.263467] env[60164]: DEBUG oslo.service.loopingcall [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 681.263467] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Creating VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 681.263467] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-31d54094-b87b-437c-9ac4-64e15b7f7d17 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.281235] env[60164]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 681.281235] env[60164]: value = "task-1295443" [ 681.281235] env[60164]: _type = "Task" [ 681.281235] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.288085] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295443, 'name': CreateVM_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.588546] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295442, 'name': CreateVM_Task, 'duration_secs': 0.277028} completed successfully. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 681.588710] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Created VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 681.589133] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 681.589287] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 681.589682] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 681.589921] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4f6e91f2-6835-4cde-8a49-48736e57f886 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.595549] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Waiting for the task: (returnval){ [ 681.595549] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5236265a-bc0a-b181-87c1-2a8c4134dc4d" [ 681.595549] env[60164]: _type = "Task" [ 681.595549] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.605105] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5236265a-bc0a-b181-87c1-2a8c4134dc4d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.792127] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295443, 'name': CreateVM_Task} progress is 6%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.898018] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Created directory with path [datastore1] vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 681.898018] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Fetch image to [datastore1] vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 681.898018] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to [datastore1] vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 681.898018] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0b25103-8798-4842-a499-7e85ebc75fc5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.908788] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c2dd214-6b97-4908-ae27-3576bd215795 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.924042] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81b03e18-768f-418f-9a21-31b1098b7941 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.965515] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7991e411-31ba-4f1b-b60a-c6c641fd74ff {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.971305] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Successfully created port: 916b0b39-e552-4059-bbaf-20e8e06f1998 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 681.976306] env[60164]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a991fc9e-3421-4b32-a0f2-bd171946b61e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.064325] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to the data store datastore1 {{(pid=60164) 
fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 682.106705] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 682.107451] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Processing image 1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 682.107748] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 682.134833] env[60164]: DEBUG oslo_vmware.rw_handles [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 682.245381] env[60164]: DEBUG oslo_vmware.rw_handles [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Completed reading data from the image iterator. {{(pid=60164) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 682.245381] env[60164]: DEBUG oslo_vmware.rw_handles [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 682.294681] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295443, 'name': CreateVM_Task, 'duration_secs': 0.84393} completed successfully. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 682.294848] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Created VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 682.295273] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 682.295505] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 682.295860] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 682.296196] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-15dae8a3-8d4a-41a4-91b1-43e25aefdbe6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.301466] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 682.301466] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52ae9e99-d2c9-e226-0a5d-cad4742abae2" [ 682.301466] env[60164]: _type = "Task" [ 682.301466] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 682.313287] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52ae9e99-d2c9-e226-0a5d-cad4742abae2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 682.814447] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 682.814697] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Processing image 1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 682.814906] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 683.257707] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "4c545ed0-7442-43db-a96f-4d7f1b785c4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.258274] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "4c545ed0-7442-43db-a96f-4d7f1b785c4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.274888] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 683.363723] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.363723] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.364653] env[60164]: INFO nova.compute.claims [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 683.554607] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ec99437-646d-4045-8871-aeccf4c79863 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.566057] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50588d12-a48c-4524-8b22-ed0ac58ed6a1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.597743] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b8139bc-5a78-4177-a894-c82aa9593e24 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.608160] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b2cabf2-822d-4024-acb9-c9058c49e3d4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.619981] env[60164]: DEBUG nova.compute.provider_tree [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 683.633252] env[60164]: DEBUG nova.scheduler.client.report [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 683.651738] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 
tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.651738] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 683.690516] env[60164]: DEBUG nova.compute.utils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 683.691793] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 683.691947] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 683.710014] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 683.784927] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 683.811164] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 683.811423] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 683.811676] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 683.811880] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 683.812033] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 683.812225] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 683.812447] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 683.812600] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 683.812761] 
env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 683.812921] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 683.813131] env[60164]: DEBUG nova.virt.hardware [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 683.813988] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e623d65a-e665-46ac-9641-6dcfdb903421 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.822894] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0444f227-9e96-4a25-b017-1cc788fb72e6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.219956] env[60164]: DEBUG nova.policy [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56bb638542d440639e1a38b10e80fb1e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0be6718d0cbe4351a06b59576311c7f8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 685.225375] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.227017] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Lock "6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.238535] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 685.291160] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.291160] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.291160] env[60164]: INFO nova.compute.claims [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 685.456417] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97a9ac27-2b5b-41c5-b4bd-e5eff4bf28f0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.465498] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dea07460-e477-49e0-857f-3d2b21cf8b0c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.500939] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-012f80a4-ea46-402a-bcf7-a5d3c9c261e3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.509067] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48732f60-84f3-4e36-bb6a-72f5af3b71a8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.523708] env[60164]: DEBUG nova.compute.provider_tree [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 685.533210] env[60164]: DEBUG nova.scheduler.client.report [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 685.550283] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 
tempest-ServerShowV254Test-893530536-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.550283] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 685.588089] env[60164]: DEBUG nova.compute.utils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 685.590552] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Not allocating networking since 'none' was specified. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 685.605612] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 685.696024] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 685.724471] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 685.724836] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 685.724836] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 685.725018] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 685.725180] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 685.725327] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 685.725526] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 685.725952] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 685.726653] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 
tempest-ServerShowV254Test-893530536-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 685.726850] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 685.727107] env[60164]: DEBUG nova.virt.hardware [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 685.728047] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b66a2c6-6e6c-419a-b564-0e1279b04e22 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.738459] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd6f7aee-2d65-4139-93d7-3f15a586f4b0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.755022] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Instance VIF info [] {{(pid=60164) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 685.762498] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Creating folder: Project (a0e821828b384a40a78522e538c9dcdf). Parent ref: group-v277790. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.763150] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0c68e1f3-02ce-4562-8e8a-39968ea4233b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.778792] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Created folder: Project (a0e821828b384a40a78522e538c9dcdf) in parent group-v277790. [ 685.779057] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Creating folder: Instances. Parent ref: group-v277802. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.779372] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-42100656-6871-487e-83cf-e3e1dc91660d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.789730] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Created folder: Instances in parent group-v277802. 
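[Editor's note, not part of the captured log] The records above and below trace the vmwareapi spawn path: a project and Instances folder are created, then Folder.CreateVM_Task is invoked and the resulting task (task-1295446) is polled until CreateVM_Task completes. A minimal sketch of how that invoke/poll sequence maps onto the oslo.vmware client API follows; the vCenter host, credentials, folder/pool references and config spec are placeholders for illustration, not values taken from this log.

# Illustrative sketch only (not from this log): the CreateVM_Task /
# wait_for_task pattern seen in the trace, expressed against oslo.vmware.
# All connection details and managed-object refs below are placeholders.
from oslo_vmware import api as vmware_api

def create_vm(folder_ref, resource_pool_ref, config_spec):
    # Open a vCenter session (the same call path as
    # oslo_vmware.api.VMwareAPISession._create_session in the log).
    session = vmware_api.VMwareAPISession(
        'vc.example.test',      # placeholder vCenter host
        'user', 'secret',       # placeholder credentials
        api_retry_count=10,
        task_poll_interval=0.5)

    # Folder.CreateVM_Task returns a task moref immediately ...
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task',
                                  folder_ref,
                                  config=config_spec,
                                  pool=resource_pool_ref)

    # ... and wait_for_task polls it (the "CreateVM_Task progress is 0%"
    # lines above) until it finishes, returning the task info whose
    # result field is the reference to the newly created VM.
    task_info = session.wait_for_task(task_ref)
    return task_info.result

[End of editor's note; the log continues below.]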
[ 685.790073] env[60164]: DEBUG oslo.service.loopingcall [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 685.790326] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Creating VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 685.790586] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8eab19ef-b4e5-47e9-8daf-901b5fedfe04 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.808465] env[60164]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 685.808465] env[60164]: value = "task-1295446" [ 685.808465] env[60164]: _type = "Task" [ 685.808465] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 685.817719] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295446, 'name': CreateVM_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 686.320752] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295446, 'name': CreateVM_Task, 'duration_secs': 0.255335} completed successfully. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 686.320752] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Created VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 686.321048] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 686.321253] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 686.321573] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 686.321998] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d9ab7cb-3645-471f-9966-915b1cb9de4c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.328248] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Waiting for the task: (returnval){ [ 686.328248] env[60164]: value = 
"session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5249ffea-be9c-dfde-9444-1639194806c4" [ 686.328248] env[60164]: _type = "Task" [ 686.328248] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 686.336809] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5249ffea-be9c-dfde-9444-1639194806c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 686.838325] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5249ffea-be9c-dfde-9444-1639194806c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 687.340961] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5249ffea-be9c-dfde-9444-1639194806c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 687.502567] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Successfully created port: 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 687.841523] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5249ffea-be9c-dfde-9444-1639194806c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 688.341304] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]5249ffea-be9c-dfde-9444-1639194806c4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 688.843702] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 688.843945] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Processing image 1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 688.844166] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.102246] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquiring lock "5980cfa0-bdd6-4fca-a605-c857e0e7b886" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.102543] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Lock "5980cfa0-bdd6-4fca-a605-c857e0e7b886" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.114461] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 690.175548] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.175548] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.177060] env[60164]: INFO nova.compute.claims [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 690.350023] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cafbb4a-b4cb-4cfc-ad26-6f6bb3ff0130 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.359687] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d551d02-be8c-4732-a4a4-f3ad2c7472b2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.394813] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-261df402-14fc-42a5-b22b-cc8f66f862f1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.402572] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3abecc13-3ec0-4cf6-83ac-f73ad989c230 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.417693] env[60164]: DEBUG nova.compute.provider_tree [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.429019] env[60164]: DEBUG nova.scheduler.client.report [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.444965] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 
tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.445596] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 690.488624] env[60164]: DEBUG nova.compute.utils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 690.488624] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 690.489587] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 690.504915] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 690.580077] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 690.609204] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 690.609204] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 690.609204] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 690.609521] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 690.609630] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 690.609770] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 690.609964] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 690.610703] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 690.610703] env[60164]: DEBUG nova.virt.hardware [None 
req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 690.610703] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 690.610703] env[60164]: DEBUG nova.virt.hardware [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 690.611540] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e60f7151-e668-44b3-b095-63bd0f7385bf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.621623] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03462c8d-1503-469a-82fe-31c8e0985dad {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.886998] env[60164]: DEBUG nova.policy [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5c9431bd76694bbab69bffb93531532f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '39a67a0257094a479656d0975c7f2127', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 691.413617] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquiring lock "5ada08c2-ea12-4b16-9384-af545c8e06aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.413961] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Lock "5ada08c2-ea12-4b16-9384-af545c8e06aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.426411] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 691.474983] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.475272] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.476779] env[60164]: INFO nova.compute.claims [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 691.677996] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1eaf86f-26b0-4792-82a4-0ec6f93feba8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.686929] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18402c1d-fdad-4335-9ea9-7d175bb9e258 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.721151] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4ba56d8-6cf6-44d4-a7c4-14ae0e9e1c4a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.729660] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5195ced-e968-4c39-b7a3-21ab8a09e2fb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.744614] env[60164]: DEBUG nova.compute.provider_tree [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 691.753472] env[60164]: DEBUG nova.scheduler.client.report [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 691.772189] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 
tempest-ServerMetadataTestJSON-855428705-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.772703] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 691.812860] env[60164]: DEBUG nova.compute.utils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 691.814197] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 691.814363] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 691.823167] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 691.912941] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 691.944653] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 691.944975] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 691.945081] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 691.945265] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 691.945443] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 691.945597] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 691.945801] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 691.945954] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 691.946829] env[60164]: DEBUG nova.virt.hardware [None 
req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 691.946829] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 691.946829] env[60164]: DEBUG nova.virt.hardware [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 691.947671] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66e3c026-c39d-4740-aec7-5a2fc1f5f4f3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.957560] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fe5a68a-a88f-434d-92fc-f5b3a7cba5c3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.178345] env[60164]: DEBUG nova.policy [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f14d545d067e4b77b659505dae36811d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d1843cb3adc4fe59efbc5dd6a7c0f32', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 692.490875] env[60164]: ERROR nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. 
[ 692.490875] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 692.490875] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 692.490875] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 692.490875] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 692.490875] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 692.490875] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 692.490875] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 692.490875] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.490875] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 692.490875] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.490875] env[60164]: ERROR nova.compute.manager raise self.value [ 692.490875] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 692.490875] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 692.490875] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.490875] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 692.491613] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.491613] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 692.491613] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. 
[ 692.491613] env[60164]: ERROR nova.compute.manager [ 692.491613] env[60164]: Traceback (most recent call last): [ 692.491613] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 692.491613] env[60164]: listener.cb(fileno) [ 692.491613] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 692.491613] env[60164]: result = function(*args, **kwargs) [ 692.491613] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 692.491613] env[60164]: return func(*args, **kwargs) [ 692.491613] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 692.491613] env[60164]: raise e [ 692.491613] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 692.491613] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 692.491613] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 692.491613] env[60164]: created_port_ids = self._update_ports_for_instance( [ 692.491613] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 692.491613] env[60164]: with excutils.save_and_reraise_exception(): [ 692.491613] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.491613] env[60164]: self.force_reraise() [ 692.491613] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.491613] env[60164]: raise self.value [ 692.491613] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 692.491613] env[60164]: updated_port = self._update_port( [ 692.491613] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.491613] env[60164]: _ensure_no_port_binding_failure(port) [ 692.491613] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.491613] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 692.492396] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. [ 692.492396] env[60164]: Removing descriptor: 20 [ 692.492396] env[60164]: ERROR nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. 
[ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Traceback (most recent call last): [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] yield resources [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self.driver.spawn(context, instance, image_meta, [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self._vmops.spawn(context, instance, image_meta, injected_files, [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 692.492396] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] vm_ref = self.build_virtual_machine(instance, [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] vif_infos = vmwarevif.get_vif_info(self._session, [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] for vif in network_info: [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return self._sync_wrapper(fn, *args, **kwargs) [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self.wait() [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self[:] = self._gt.wait() [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return self._exit_event.wait() [ 692.492723] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 692.492723] env[60164]: ERROR nova.compute.manager 
[instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] result = hub.switch() [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return self.greenlet.switch() [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] result = function(*args, **kwargs) [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return func(*args, **kwargs) [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] raise e [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] nwinfo = self.network_api.allocate_for_instance( [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] created_port_ids = self._update_ports_for_instance( [ 692.493102] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] with excutils.save_and_reraise_exception(): [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self.force_reraise() [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] raise self.value [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] updated_port = self._update_port( [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: 
e46ad9d2-d215-4205-b0c1-44726b08cb45] _ensure_no_port_binding_failure(port) [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] raise exception.PortBindingFailed(port_id=port['id']) [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. [ 692.493464] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] [ 692.493823] env[60164]: INFO nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Terminating instance [ 692.497094] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "refresh_cache-e46ad9d2-d215-4205-b0c1-44726b08cb45" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.497334] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquired lock "refresh_cache-e46ad9d2-d215-4205-b0c1-44726b08cb45" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.497439] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 692.587211] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 693.164129] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.179412] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Releasing lock "refresh_cache-e46ad9d2-d215-4205-b0c1-44726b08cb45" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 693.180636] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 693.183941] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 693.183941] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2c18816f-860c-490b-a671-5493123a439b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.196668] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca375bc-5986-4157-a03e-13e8fd4df4df {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.231996] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e46ad9d2-d215-4205-b0c1-44726b08cb45 could not be found. [ 693.232292] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 693.232503] env[60164]: INFO nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Took 0.05 seconds to destroy the instance on the hypervisor. [ 693.232684] env[60164]: DEBUG oslo.service.loopingcall [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 693.232913] env[60164]: DEBUG nova.compute.manager [-] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 693.233009] env[60164]: DEBUG nova.network.neutron [-] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 693.303479] env[60164]: DEBUG nova.network.neutron [-] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 693.309432] env[60164]: DEBUG nova.network.neutron [-] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.325518] env[60164]: INFO nova.compute.manager [-] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Took 0.09 seconds to deallocate network for instance. [ 693.332797] env[60164]: DEBUG nova.compute.claims [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 693.332797] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.332797] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.357047] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Successfully created port: c4b075f2-0c8a-430a-9b5a-9817c1efc2b3 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 693.513537] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquiring lock "7e356a2e-b299-4801-af74-f536a12489fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.515147] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Lock "7e356a2e-b299-4801-af74-f536a12489fc" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.526512] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 693.586300] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50b7476-4469-40a2-88dd-9aeee9212969 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.601649] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60430888-e258-4d89-bbb2-230830bc3792 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.642049] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.642923] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-361b5e1b-093b-4031-b13f-f628e720236f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.652485] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9354f3d-4a3e-4c5f-9e02-3912fb827274 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.670192] env[60164]: DEBUG nova.compute.provider_tree [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 693.686811] env[60164]: DEBUG nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 693.712098] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.382s {{(pid=60164) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.712718] env[60164]: ERROR nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Traceback (most recent call last): [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self.driver.spawn(context, instance, image_meta, [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self._vmops.spawn(context, instance, image_meta, injected_files, [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] vm_ref = self.build_virtual_machine(instance, [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] vif_infos = vmwarevif.get_vif_info(self._session, [ 693.712718] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] for vif in network_info: [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return self._sync_wrapper(fn, *args, **kwargs) [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self.wait() [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self[:] = self._gt.wait() [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return self._exit_event.wait() [ 693.716306] env[60164]: ERROR 
nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] result = hub.switch() [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return self.greenlet.switch() [ 693.716306] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] result = function(*args, **kwargs) [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] return func(*args, **kwargs) [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] raise e [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] nwinfo = self.network_api.allocate_for_instance( [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] created_port_ids = self._update_ports_for_instance( [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] with excutils.save_and_reraise_exception(): [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 693.716690] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] self.force_reraise() [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] raise self.value [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] updated_port = self._update_port( [ 693.717020] env[60164]: ERROR 
nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] _ensure_no_port_binding_failure(port) [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] raise exception.PortBindingFailed(port_id=port['id']) [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] nova.exception.PortBindingFailed: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. [ 693.717020] env[60164]: ERROR nova.compute.manager [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] [ 693.717020] env[60164]: DEBUG nova.compute.utils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 693.717326] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.073s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.719806] env[60164]: INFO nova.compute.claims [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 693.723048] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Build of instance e46ad9d2-d215-4205-b0c1-44726b08cb45 was re-scheduled: Binding failed for port 916b0b39-e552-4059-bbaf-20e8e06f1998, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 693.723526] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 693.723740] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "refresh_cache-e46ad9d2-d215-4205-b0c1-44726b08cb45" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 693.723876] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquired lock "refresh_cache-e46ad9d2-d215-4205-b0c1-44726b08cb45" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 693.724049] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 693.787684] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 693.956375] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6c62b21-0f8c-4a19-a3d7-e5e654e43f36 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.966261] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53788799-4f80-4b07-9afa-68639a9eaf05 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.003866] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3b4e0e5-59dd-497b-816c-b1d28d478fa2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.012266] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96386440-bc6c-4295-9872-46f670fa0e1f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.026696] env[60164]: DEBUG nova.compute.provider_tree [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 694.034756] env[60164]: DEBUG nova.scheduler.client.report [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 694.055788] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.055788] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 694.100820] env[60164]: DEBUG nova.compute.utils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 694.102725] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 694.102987] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 694.117850] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 694.186656] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.197458] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Releasing lock "refresh_cache-e46ad9d2-d215-4205-b0c1-44726b08cb45" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.197771] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 694.198052] env[60164]: DEBUG nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 694.198331] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 694.210045] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 694.239987] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 694.245015] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 694.245015] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 694.245015] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 694.245015] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:387}} [ 694.245015] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 694.245288] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 694.245288] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 694.245288] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 694.245288] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 694.245288] env[60164]: DEBUG nova.virt.hardware [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 694.245548] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7020a52b-ca22-408c-82aa-2aa347db1c00 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.257211] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c3ebd8-20d0-40ca-a1f3-9fc9b4d6a4d0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.261921] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 694.274772] env[60164]: DEBUG nova.network.neutron [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.284819] env[60164]: INFO nova.compute.manager [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: e46ad9d2-d215-4205-b0c1-44726b08cb45] Took 0.09 seconds to deallocate network for instance. [ 694.383617] env[60164]: INFO nova.scheduler.client.report [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Deleted allocations for instance e46ad9d2-d215-4205-b0c1-44726b08cb45 [ 694.403618] env[60164]: DEBUG oslo_concurrency.lockutils [None req-f150b914-55df-4af9-b649-b789a293f473 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "e46ad9d2-d215-4205-b0c1-44726b08cb45" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.772s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.456063] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Successfully created port: 93c5b658-4e25-4747-b1bb-79ac51446057 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 694.672185] env[60164]: DEBUG nova.policy [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb884ffadf2145ad959c0b159464b7b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df737b66de7f4333a492593a77abd42d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 695.669967] env[60164]: ERROR nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. 
[ 695.669967] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 695.669967] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 695.669967] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 695.669967] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 695.669967] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 695.669967] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 695.669967] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 695.669967] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 695.669967] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 695.669967] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 695.669967] env[60164]: ERROR nova.compute.manager raise self.value [ 695.669967] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 695.669967] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 695.669967] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 695.669967] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 695.670637] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 695.670637] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 695.670637] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. 
[ 695.670637] env[60164]: ERROR nova.compute.manager [ 695.670637] env[60164]: Traceback (most recent call last): [ 695.670637] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 695.670637] env[60164]: listener.cb(fileno) [ 695.670637] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 695.670637] env[60164]: result = function(*args, **kwargs) [ 695.670637] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 695.670637] env[60164]: return func(*args, **kwargs) [ 695.670637] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 695.670637] env[60164]: raise e [ 695.670637] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 695.670637] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 695.670637] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 695.670637] env[60164]: created_port_ids = self._update_ports_for_instance( [ 695.670637] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 695.670637] env[60164]: with excutils.save_and_reraise_exception(): [ 695.670637] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 695.670637] env[60164]: self.force_reraise() [ 695.670637] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 695.670637] env[60164]: raise self.value [ 695.670637] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 695.670637] env[60164]: updated_port = self._update_port( [ 695.670637] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 695.670637] env[60164]: _ensure_no_port_binding_failure(port) [ 695.670637] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 695.670637] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 695.671873] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. [ 695.671873] env[60164]: Removing descriptor: 19 [ 695.671873] env[60164]: ERROR nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. 
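The second traceback above (the one without an instance tag, reported just before "Removing descriptor: 19") is the eventlet hub logging the same exception from the greenthread that runs _allocate_network_async; the spawning code only sees it later, when it iterates network_info and _sync_wrapper() calls wait() on the greenthread. A minimal, hedged illustration of that deferred-failure pattern using plain eventlet (allocate_ports is a made-up stand-in; only the spawn/wait mechanics mirror what the log shows):

    import eventlet

    def allocate_ports():
        # Stand-in for nova's _allocate_network_async: the real code talks to
        # Neutron and re-raises PortBindingFailed inside the greenthread.
        raise RuntimeError(
            "Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7")

    # spawn() returns immediately; the failure stays inside the greenthread...
    gt = eventlet.spawn(allocate_ports)

    try:
        # ...and only surfaces when the consumer waits for the result, which is
        # why the binding error appears well after "Starting instance...".
        gt.wait()
    except RuntimeError as exc:
        print(exc)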
[ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Traceback (most recent call last): [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] yield resources [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self.driver.spawn(context, instance, image_meta, [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 695.671873] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] vm_ref = self.build_virtual_machine(instance, [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] vif_infos = vmwarevif.get_vif_info(self._session, [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] for vif in network_info: [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return self._sync_wrapper(fn, *args, **kwargs) [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self.wait() [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self[:] = self._gt.wait() [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return self._exit_event.wait() [ 695.672500] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 695.672500] env[60164]: ERROR nova.compute.manager 
[instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] result = hub.switch() [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return self.greenlet.switch() [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] result = function(*args, **kwargs) [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return func(*args, **kwargs) [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] raise e [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] nwinfo = self.network_api.allocate_for_instance( [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] created_port_ids = self._update_ports_for_instance( [ 695.673472] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] with excutils.save_and_reraise_exception(): [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self.force_reraise() [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] raise self.value [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] updated_port = self._update_port( [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 
4c545ed0-7442-43db-a96f-4d7f1b785c4d] _ensure_no_port_binding_failure(port) [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] raise exception.PortBindingFailed(port_id=port['id']) [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. [ 695.673950] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] [ 695.675166] env[60164]: INFO nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Terminating instance [ 695.675166] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "refresh_cache-4c545ed0-7442-43db-a96f-4d7f1b785c4d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.675166] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquired lock "refresh_cache-4c545ed0-7442-43db-a96f-4d7f1b785c4d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 695.675166] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 695.766660] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 696.508503] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.524204] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Releasing lock "refresh_cache-4c545ed0-7442-43db-a96f-4d7f1b785c4d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 696.525203] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 696.525469] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 696.526014] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9b5793e5-01a4-485b-b58f-d9ea62c61fa6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.543020] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "e52a3adf-4654-43cd-8613-749277053ea8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.543020] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "e52a3adf-4654-43cd-8613-749277053ea8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.553046] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dcbc90a-539d-4f80-9cbd-9ffcd6d0d8c8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.568011] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 696.589808] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4c545ed0-7442-43db-a96f-4d7f1b785c4d could not be found. [ 696.589808] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 696.589808] env[60164]: INFO nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Took 0.06 seconds to destroy the instance on the hypervisor. [ 696.589808] env[60164]: DEBUG oslo.service.loopingcall [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 696.589808] env[60164]: DEBUG nova.compute.manager [-] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 696.590046] env[60164]: DEBUG nova.network.neutron [-] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 696.634820] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.634820] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.636344] env[60164]: INFO nova.compute.claims [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 696.685795] env[60164]: DEBUG nova.network.neutron [-] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 696.699945] env[60164]: DEBUG nova.network.neutron [-] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.751563] env[60164]: INFO nova.compute.manager [-] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Took 0.16 seconds to deallocate network for instance. [ 696.757493] env[60164]: DEBUG nova.compute.claims [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 696.757662] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.778393] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquiring lock "ef7b219e-437d-4b15-b559-ca5e2405efb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.778780] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Lock "ef7b219e-437d-4b15-b559-ca5e2405efb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.789926] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 696.860062] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.913515] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Successfully created port: e641e291-cbd8-404d-ab1d-4dee8d7969cd {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 696.920511] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff25269-621f-4d75-aea2-eb33c35423f4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.928633] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0863c00a-c10f-49c4-ad0c-9cbcf1879185 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.962938] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2c7dc94-2f5e-42e0-81ea-3d39d9db5874 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.972071] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad60eee0-b57a-4b0a-b0a3-291235074cb6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.988020] env[60164]: DEBUG nova.compute.provider_tree [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.001017] env[60164]: DEBUG nova.scheduler.client.report [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.024278] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.389s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.024799] 
env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 697.028192] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.270s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.071830] env[60164]: DEBUG nova.compute.utils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.074962] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 697.075163] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 697.091193] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 697.176730] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 697.202707] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 697.203011] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 697.203210] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 697.203645] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 697.203883] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 697.204111] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 697.204328] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 697.204520] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:466}} [ 697.204725] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 697.204919] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 697.205144] env[60164]: DEBUG nova.virt.hardware [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 697.206736] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aa221e7-7682-47f0-ae1e-b3d3dcef09e8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.223789] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-357ff611-f843-4b9e-86aa-e1d330dcb210 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.258463] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05d0f319-f30f-4016-b7bd-56b2e426f5b1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.265781] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de046061-5308-4e31-9ec3-944327f5624e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.302862] env[60164]: DEBUG nova.policy [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '38afd0b6a9a24556bd374e62b3363f3d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15c6dd50db8b44d190cdaaf8e69222da', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 697.305094] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92194223-f8ef-44b9-b30d-da82639aa86f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.314261] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-186b031f-87d5-4b12-ad5f-685207375703 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.329930] env[60164]: DEBUG nova.compute.provider_tree [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 
tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.345605] env[60164]: DEBUG nova.scheduler.client.report [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.365789] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.338s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.366437] env[60164]: ERROR nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. 
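[editor's note] The inventory payload logged above for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f carries total, reserved, allocation_ratio and min/max_unit per resource class. A minimal sketch of the capacity arithmetic those fields imply (the (total - reserved) * allocation_ratio convention used by Placement); the helper names and the fits() check are illustrative, not Nova or Placement code:

# Illustrative only: the effective-capacity arithmetic implied by the
# inventory payload logged above (not code from nova or placement).
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0,
                  'min_unit': 1, 'max_unit': 16,    'step_size': 1},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0,
                  'min_unit': 1, 'max_unit': 65530, 'step_size': 1},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0,
                  'min_unit': 1, 'max_unit': 139,   'step_size': 1},
}

def effective_capacity(inv):
    # Placement treats (total - reserved) * allocation_ratio as the aggregate
    # amount that can be allocated for a resource class.
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

def fits(inv, requested, used=0):
    # A single allocation must also respect min_unit/max_unit/step_size.
    if not (inv['min_unit'] <= requested <= inv['max_unit']):
        return False
    if requested % inv['step_size'] != 0:
        return False
    return used + requested <= effective_capacity(inv)

for rc, inv in inventory.items():
    print(rc, effective_capacity(inv))   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0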
[ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Traceback (most recent call last): [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self.driver.spawn(context, instance, image_meta, [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] vm_ref = self.build_virtual_machine(instance, [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] vif_infos = vmwarevif.get_vif_info(self._session, [ 697.366437] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] for vif in network_info: [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return self._sync_wrapper(fn, *args, **kwargs) [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self.wait() [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self[:] = self._gt.wait() [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return self._exit_event.wait() [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] result = hub.switch() [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 697.366931] env[60164]: ERROR 
nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return self.greenlet.switch() [ 697.366931] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] result = function(*args, **kwargs) [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] return func(*args, **kwargs) [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] raise e [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] nwinfo = self.network_api.allocate_for_instance( [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] created_port_ids = self._update_ports_for_instance( [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] with excutils.save_and_reraise_exception(): [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 697.367344] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] self.force_reraise() [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] raise self.value [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] updated_port = self._update_port( [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] _ensure_no_port_binding_failure(port) [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 697.367678] env[60164]: ERROR 
nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] raise exception.PortBindingFailed(port_id=port['id']) [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] nova.exception.PortBindingFailed: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. [ 697.367678] env[60164]: ERROR nova.compute.manager [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] [ 697.367678] env[60164]: DEBUG nova.compute.utils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 697.369192] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Build of instance 4c545ed0-7442-43db-a96f-4d7f1b785c4d was re-scheduled: Binding failed for port 6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 697.369808] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 697.369945] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "refresh_cache-4c545ed0-7442-43db-a96f-4d7f1b785c4d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 697.369988] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquired lock "refresh_cache-4c545ed0-7442-43db-a96f-4d7f1b785c4d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 697.370149] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 697.372333] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.513s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.375517] env[60164]: INFO nova.compute.claims [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: 
ef7b219e-437d-4b15-b559-ca5e2405efb2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 697.630684] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ab4ded-ed7f-4b54-b174-d83c7804adf6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.641338] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c9122cd-2308-4b11-b810-c24ea1d511c2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.684043] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 697.686776] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbffae9-f8b4-4e3e-b9e4-8d9edc747437 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.696830] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13df591f-376d-4760-a366-f62dca7606d9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.712642] env[60164]: DEBUG nova.compute.provider_tree [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.725876] env[60164]: DEBUG nova.scheduler.client.report [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.744832] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.745359] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 697.789574] env[60164]: DEBUG nova.compute.utils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.790854] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 697.790897] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 697.801834] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 697.895055] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 697.923812] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 697.924042] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 697.924702] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 697.924702] env[60164]: 
DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 697.924702] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 697.924702] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 697.924878] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 697.924962] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 697.925139] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 697.925297] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 697.925467] env[60164]: DEBUG nova.virt.hardware [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 697.926491] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62eb5c9e-981a-4871-9b1f-962e113e12da {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.935120] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947c4a1b-7c68-4ad8-a20f-d14279eaf49d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.119951] env[60164]: DEBUG nova.policy [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea6be32321a24a638ad7079b323457de', 'user_domain_id': 'default', 
'system_scope': None, 'domain_id': None, 'project_id': '61a1fea64b51425d840cb421206a4191', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 698.340442] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.349995] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Releasing lock "refresh_cache-4c545ed0-7442-43db-a96f-4d7f1b785c4d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 698.350249] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 698.350435] env[60164]: DEBUG nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 698.350592] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 698.428325] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 698.437970] env[60164]: DEBUG nova.network.neutron [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.448168] env[60164]: INFO nova.compute.manager [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 4c545ed0-7442-43db-a96f-4d7f1b785c4d] Took 0.10 seconds to deallocate network for instance. 
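[editor's note] The traceback above ends in nova.network.neutron._ensure_no_port_binding_failure raising PortBindingFailed, which is what triggers the re-schedule and the network deallocation just logged. A simplified stand-in for that check, assuming a failed binding is signalled by Neutron as binding:vif_type == 'binding_failed'; the function name and exception mirror the log, the body is a sketch rather than Nova's actual implementation:

# Simplified stand-in for the check seen in the tracebacks above; assumes a
# failed binding is reported by Neutron as binding:vif_type == 'binding_failed'.
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")

def _ensure_no_port_binding_failure(port):
    binding_vif_type = port.get('binding:vif_type')
    if binding_vif_type == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# Example: a port Neutron could not bind to any mechanism driver.
port = {'id': '6ea2b6b0-f8d9-4d8e-8406-96a2d5c43db7',
        'binding:vif_type': 'binding_failed'}
_ensure_no_port_binding_failure(port)   # raises PortBindingFailed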
[ 698.566681] env[60164]: INFO nova.scheduler.client.report [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Deleted allocations for instance 4c545ed0-7442-43db-a96f-4d7f1b785c4d [ 698.587801] env[60164]: DEBUG oslo_concurrency.lockutils [None req-932f1f02-6b5d-44c3-a7cc-23843d41d098 tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "4c545ed0-7442-43db-a96f-4d7f1b785c4d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.329s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.751253] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquiring lock "9bee98ef-48b4-47e6-8afb-e535e58e50cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.751646] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Lock "9bee98ef-48b4-47e6-8afb-e535e58e50cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.760784] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 698.815293] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.815571] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.817061] env[60164]: INFO nova.compute.claims [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 699.073300] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ccbae77-ca15-49f3-a0eb-d3805325160e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.082174] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f56a79-d7f2-4849-a29b-6b0f4741451a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.127391] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a1a5506-eac3-41bf-9357-adec148c9e48 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.136956] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef8baa4c-29f4-4af2-bd11-eaba36f9ec3c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.154026] env[60164]: DEBUG nova.compute.provider_tree [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 699.165430] env[60164]: DEBUG nova.scheduler.client.report [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 699.196528] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 
tempest-ServersTestMultiNic-1861207233-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.381s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.197795] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 699.247090] env[60164]: DEBUG nova.compute.utils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 699.251224] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 699.251456] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 699.274889] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 699.349319] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Successfully created port: 53857d03-68f5-47d9-b7d9-b532a6d42fcf {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 699.374445] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 699.389719] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquiring lock "9e88b24c-500d-4efb-8563-093dd4d0378d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.389719] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Lock "9e88b24c-500d-4efb-8563-093dd4d0378d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.403349] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 699.404030] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 699.404030] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 699.404030] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 699.404030] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 699.404210] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Chose sockets=0, cores=0, threads=0; limits were 
sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 699.404367] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 699.406416] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 699.406416] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 699.406416] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 699.406416] env[60164]: DEBUG nova.virt.hardware [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 699.406416] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9632ab29-e25a-43b3-af89-30c374be18b5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.416902] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ea0407e-5b10-4ba6-bdf3-324570e10651 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.559943] env[60164]: DEBUG nova.policy [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af3b82f197e4e7fbb0600b1eb0b34ef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f32917678ead4147ab017ac9de07a145', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 700.065019] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Successfully created port: 05e585fb-54d8-4a6a-b92f-a1b020c55e21 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 700.639036] env[60164]: ERROR 
nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. [ 700.639036] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 700.639036] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 700.639036] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 700.639036] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 700.639036] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 700.639036] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 700.639036] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 700.639036] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 700.639036] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 700.639036] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 700.639036] env[60164]: ERROR nova.compute.manager raise self.value [ 700.639036] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 700.639036] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 700.639036] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 700.639036] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 700.639553] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 700.639553] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 700.639553] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. 
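[editor's note] Every traceback in this run passes through oslo_utils.excutils.save_and_reraise_exception in _update_ports_for_instance, which is why the original PortBindingFailed survives the cleanup path and reappears via force_reraise(). A small self-contained example of that (real) context manager; the surrounding function and the failing callable are hypothetical:

from oslo_utils import excutils

def update_port_or_cleanup(update, cleanup):
    try:
        return update()
    except Exception:
        # save_and_reraise_exception records the in-flight exception, runs the
        # body (cleanup), and re-raises the original error on exit -- the
        # force_reraise() frames visible in the traceback above.
        with excutils.save_and_reraise_exception():
            cleanup()

def failing_update():
    raise RuntimeError("binding failed")

try:
    update_port_or_cleanup(failing_update, cleanup=lambda: print("cleaned up"))
except RuntimeError as exc:
    print("re-raised:", exc)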
[ 700.639553] env[60164]: ERROR nova.compute.manager [ 700.639553] env[60164]: Traceback (most recent call last): [ 700.639553] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 700.639553] env[60164]: listener.cb(fileno) [ 700.639553] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 700.639553] env[60164]: result = function(*args, **kwargs) [ 700.639553] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 700.639553] env[60164]: return func(*args, **kwargs) [ 700.639553] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 700.639553] env[60164]: raise e [ 700.639553] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 700.639553] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 700.639553] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 700.639553] env[60164]: created_port_ids = self._update_ports_for_instance( [ 700.639553] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 700.639553] env[60164]: with excutils.save_and_reraise_exception(): [ 700.639553] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 700.639553] env[60164]: self.force_reraise() [ 700.639553] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 700.639553] env[60164]: raise self.value [ 700.639553] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 700.639553] env[60164]: updated_port = self._update_port( [ 700.639553] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 700.639553] env[60164]: _ensure_no_port_binding_failure(port) [ 700.639553] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 700.639553] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 700.640293] env[60164]: nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. [ 700.640293] env[60164]: Removing descriptor: 14 [ 700.640293] env[60164]: ERROR nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. 
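[editor's note] The "Removing descriptor" line and the jump from _allocate_network_async into the spawn path above reflect how network allocation runs in a background greenthread: the failure happens asynchronously and only surfaces when the network_info wrapper is iterated and wait() is called (get_vif_info -> _sync_wrapper -> wait() in the traceback). A minimal eventlet sketch of that behaviour, not Nova code:

import eventlet

def allocate_network():
    # Stand-in for nova's _allocate_network_async: the failure happens here,
    # in the background greenthread, not in the caller.
    raise RuntimeError("Binding failed for port ...")

gt = eventlet.spawn(allocate_network)

# The caller keeps building the instance; the error only surfaces once the
# result is actually needed, i.e. when wait() is called on the greenthread.
try:
    gt.wait()
except RuntimeError as exc:
    print("surfaced at wait():", exc)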
[ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Traceback (most recent call last): [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] yield resources [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self.driver.spawn(context, instance, image_meta, [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self._vmops.spawn(context, instance, image_meta, injected_files, [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 700.640293] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] vm_ref = self.build_virtual_machine(instance, [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] vif_infos = vmwarevif.get_vif_info(self._session, [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] for vif in network_info: [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return self._sync_wrapper(fn, *args, **kwargs) [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self.wait() [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self[:] = self._gt.wait() [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return self._exit_event.wait() [ 700.640626] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 700.640626] env[60164]: ERROR nova.compute.manager 
[instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] result = hub.switch() [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return self.greenlet.switch() [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] result = function(*args, **kwargs) [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return func(*args, **kwargs) [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] raise e [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] nwinfo = self.network_api.allocate_for_instance( [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] created_port_ids = self._update_ports_for_instance( [ 700.640949] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] with excutils.save_and_reraise_exception(): [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self.force_reraise() [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] raise self.value [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] updated_port = self._update_port( [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 
5980cfa0-bdd6-4fca-a605-c857e0e7b886] _ensure_no_port_binding_failure(port) [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] raise exception.PortBindingFailed(port_id=port['id']) [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. [ 700.641323] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] [ 700.641741] env[60164]: INFO nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Terminating instance [ 700.641741] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquiring lock "refresh_cache-5980cfa0-bdd6-4fca-a605-c857e0e7b886" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.641741] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquired lock "refresh_cache-5980cfa0-bdd6-4fca-a605-c857e0e7b886" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.641741] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 700.696501] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 701.205657] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.217462] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Releasing lock "refresh_cache-5980cfa0-bdd6-4fca-a605-c857e0e7b886" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.218257] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 701.219447] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 701.221100] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ca60e1bf-f9e1-4343-95e6-08e167b9c829 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.236251] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb59c968-400e-4b90-a362-a0d48a268575 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.263601] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5980cfa0-bdd6-4fca-a605-c857e0e7b886 could not be found. [ 701.263943] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 701.264169] env[60164]: INFO nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Took 0.05 seconds to destroy the instance on the hypervisor. [ 701.264896] env[60164]: DEBUG oslo.service.loopingcall [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 701.264896] env[60164]: DEBUG nova.compute.manager [-] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 701.264896] env[60164]: DEBUG nova.network.neutron [-] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 701.294376] env[60164]: DEBUG nova.network.neutron [-] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 701.306596] env[60164]: DEBUG nova.network.neutron [-] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.311888] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Successfully created port: 00b78937-7d5e-4965-91a5-30f8f3c29b85 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 701.331668] env[60164]: INFO nova.compute.manager [-] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Took 0.06 seconds to deallocate network for instance. [ 701.331668] env[60164]: DEBUG nova.compute.claims [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 701.331668] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.331668] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.535611] env[60164]: ERROR nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. 
[ 701.535611] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 701.535611] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 701.535611] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 701.535611] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 701.535611] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 701.535611] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 701.535611] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 701.535611] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 701.535611] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 701.535611] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 701.535611] env[60164]: ERROR nova.compute.manager raise self.value [ 701.535611] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 701.535611] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 701.535611] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 701.535611] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 701.536148] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 701.536148] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 701.536148] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. 
[ 701.536148] env[60164]: ERROR nova.compute.manager [ 701.536148] env[60164]: Traceback (most recent call last): [ 701.536148] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 701.536148] env[60164]: listener.cb(fileno) [ 701.536148] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 701.536148] env[60164]: result = function(*args, **kwargs) [ 701.536148] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 701.536148] env[60164]: return func(*args, **kwargs) [ 701.536148] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 701.536148] env[60164]: raise e [ 701.536148] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 701.536148] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 701.536148] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 701.536148] env[60164]: created_port_ids = self._update_ports_for_instance( [ 701.536148] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 701.536148] env[60164]: with excutils.save_and_reraise_exception(): [ 701.536148] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 701.536148] env[60164]: self.force_reraise() [ 701.536148] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 701.536148] env[60164]: raise self.value [ 701.536148] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 701.536148] env[60164]: updated_port = self._update_port( [ 701.536148] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 701.536148] env[60164]: _ensure_no_port_binding_failure(port) [ 701.536148] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 701.536148] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 701.536985] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. [ 701.536985] env[60164]: Removing descriptor: 18 [ 701.536985] env[60164]: ERROR nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. 
[ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Traceback (most recent call last): [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] yield resources [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self.driver.spawn(context, instance, image_meta, [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 701.536985] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] vm_ref = self.build_virtual_machine(instance, [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] vif_infos = vmwarevif.get_vif_info(self._session, [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] for vif in network_info: [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return self._sync_wrapper(fn, *args, **kwargs) [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self.wait() [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self[:] = self._gt.wait() [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return self._exit_event.wait() [ 701.537353] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 701.537353] env[60164]: ERROR nova.compute.manager 
[instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] result = hub.switch() [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return self.greenlet.switch() [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] result = function(*args, **kwargs) [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return func(*args, **kwargs) [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] raise e [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] nwinfo = self.network_api.allocate_for_instance( [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] created_port_ids = self._update_ports_for_instance( [ 701.537911] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] with excutils.save_and_reraise_exception(): [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self.force_reraise() [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] raise self.value [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] updated_port = self._update_port( [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 
5ada08c2-ea12-4b16-9384-af545c8e06aa] _ensure_no_port_binding_failure(port) [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] raise exception.PortBindingFailed(port_id=port['id']) [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. [ 701.538771] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] [ 701.539480] env[60164]: INFO nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Terminating instance [ 701.539480] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquiring lock "refresh_cache-5ada08c2-ea12-4b16-9384-af545c8e06aa" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.539480] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquired lock "refresh_cache-5ada08c2-ea12-4b16-9384-af545c8e06aa" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.539480] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 701.613678] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7db4ebb8-a09e-42e0-9b6a-ab0397fcd493 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.617943] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 701.626169] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9749742-5998-4484-87e9-5da072e38d7e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.663072] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3cf2a82-d514-4163-9f46-7b49adf54a3a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.671709] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0af0936-4524-4aae-af12-66230a07aa30 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.686113] env[60164]: DEBUG nova.compute.provider_tree [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 701.695677] env[60164]: DEBUG nova.scheduler.client.report [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 701.714693] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.384s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.715349] env[60164]: ERROR nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. 
[ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Traceback (most recent call last): [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self.driver.spawn(context, instance, image_meta, [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self._vmops.spawn(context, instance, image_meta, injected_files, [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] vm_ref = self.build_virtual_machine(instance, [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] vif_infos = vmwarevif.get_vif_info(self._session, [ 701.715349] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] for vif in network_info: [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return self._sync_wrapper(fn, *args, **kwargs) [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self.wait() [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self[:] = self._gt.wait() [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return self._exit_event.wait() [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] result = hub.switch() [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 701.715685] env[60164]: ERROR 
nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return self.greenlet.switch() [ 701.715685] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] result = function(*args, **kwargs) [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] return func(*args, **kwargs) [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] raise e [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] nwinfo = self.network_api.allocate_for_instance( [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] created_port_ids = self._update_ports_for_instance( [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] with excutils.save_and_reraise_exception(): [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 701.716029] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] self.force_reraise() [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] raise self.value [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] updated_port = self._update_port( [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] _ensure_no_port_binding_failure(port) [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 701.716369] env[60164]: ERROR 
nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] raise exception.PortBindingFailed(port_id=port['id']) [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] nova.exception.PortBindingFailed: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. [ 701.716369] env[60164]: ERROR nova.compute.manager [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] [ 701.716369] env[60164]: DEBUG nova.compute.utils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 701.717891] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Build of instance 5980cfa0-bdd6-4fca-a605-c857e0e7b886 was re-scheduled: Binding failed for port c4b075f2-0c8a-430a-9b5a-9817c1efc2b3, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 701.718349] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 701.718893] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquiring lock "refresh_cache-5980cfa0-bdd6-4fca-a605-c857e0e7b886" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.718893] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Acquired lock "refresh_cache-5980cfa0-bdd6-4fca-a605-c857e0e7b886" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.718893] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 701.781619] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 702.221019] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.236736] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Releasing lock "refresh_cache-5980cfa0-bdd6-4fca-a605-c857e0e7b886" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.236969] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 702.237156] env[60164]: DEBUG nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 702.237314] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 702.358445] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.373378] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Releasing lock "refresh_cache-5ada08c2-ea12-4b16-9384-af545c8e06aa" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.373829] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 702.375156] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 702.375709] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8ebc882b-5228-4fe6-948d-1119c6a3ad73 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.390200] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d350c7e-bce3-475c-aeab-f3dafcf9427d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.420777] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5ada08c2-ea12-4b16-9384-af545c8e06aa could not be found. [ 702.420777] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 702.420777] env[60164]: INFO nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Took 0.05 seconds to destroy the instance on the hypervisor. [ 702.420777] env[60164]: DEBUG oslo.service.loopingcall [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 702.420777] env[60164]: DEBUG nova.compute.manager [-] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 702.421241] env[60164]: DEBUG nova.network.neutron [-] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 702.492208] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 702.495610] env[60164]: DEBUG nova.network.neutron [-] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 702.502097] env[60164]: DEBUG nova.network.neutron [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.507055] env[60164]: DEBUG nova.network.neutron [-] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.519112] env[60164]: INFO nova.compute.manager [-] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Took 0.10 seconds to deallocate network for instance. [ 702.524020] env[60164]: INFO nova.compute.manager [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] [instance: 5980cfa0-bdd6-4fca-a605-c857e0e7b886] Took 0.28 seconds to deallocate network for instance. [ 702.524020] env[60164]: DEBUG nova.compute.claims [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 702.524150] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.524294] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.527691] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Successfully created port: dc46a7a3-48c8-41c0-b17a-c53bf30e5f64 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.637040] env[60164]: INFO nova.scheduler.client.report [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Deleted allocations for instance 5980cfa0-bdd6-4fca-a605-c857e0e7b886 [ 702.660962] env[60164]: DEBUG oslo_concurrency.lockutils [None req-6a659fcc-5f6c-43fa-af74-873b3ff2b4a5 tempest-ServerPasswordTestJSON-1791643214 tempest-ServerPasswordTestJSON-1791643214-project-member] Lock "5980cfa0-bdd6-4fca-a605-c857e0e7b886" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.558s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.684325] env[60164]: DEBUG nova.compute.manager [None 
req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 702.743124] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.775021] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20ca45c7-ba95-4485-9658-500354ebf63c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.782179] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a734fa-bf75-48c3-a38d-0dd24c516041 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.824199] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92d41003-6a30-4ee7-9db6-91a45a7a2062 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.831870] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eb151dd-7de4-448f-ac7d-5cc67c0bd9c1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.850316] env[60164]: DEBUG nova.compute.provider_tree [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.862169] env[60164]: DEBUG nova.scheduler.client.report [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.883546] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.359s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.884176] env[60164]: ERROR nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 
5ada08c2-ea12-4b16-9384-af545c8e06aa] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Traceback (most recent call last): [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self.driver.spawn(context, instance, image_meta, [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] vm_ref = self.build_virtual_machine(instance, [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] vif_infos = vmwarevif.get_vif_info(self._session, [ 702.884176] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] for vif in network_info: [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return self._sync_wrapper(fn, *args, **kwargs) [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self.wait() [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self[:] = self._gt.wait() [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return self._exit_event.wait() [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] result = hub.switch() [ 702.884465] 
env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return self.greenlet.switch() [ 702.884465] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] result = function(*args, **kwargs) [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] return func(*args, **kwargs) [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] raise e [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] nwinfo = self.network_api.allocate_for_instance( [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] created_port_ids = self._update_ports_for_instance( [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] with excutils.save_and_reraise_exception(): [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 702.884788] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] self.force_reraise() [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] raise self.value [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] updated_port = self._update_port( [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] _ensure_no_port_binding_failure(port) [ 702.885100] 
env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] raise exception.PortBindingFailed(port_id=port['id']) [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] nova.exception.PortBindingFailed: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. [ 702.885100] env[60164]: ERROR nova.compute.manager [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] [ 702.885100] env[60164]: DEBUG nova.compute.utils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 702.886923] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.144s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.889512] env[60164]: INFO nova.compute.claims [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 702.892800] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Build of instance 5ada08c2-ea12-4b16-9384-af545c8e06aa was re-scheduled: Binding failed for port 93c5b658-4e25-4747-b1bb-79ac51446057, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 702.893284] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 702.893509] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquiring lock "refresh_cache-5ada08c2-ea12-4b16-9384-af545c8e06aa" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 702.893654] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Acquired lock "refresh_cache-5ada08c2-ea12-4b16-9384-af545c8e06aa" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 702.893808] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 702.952862] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 703.198619] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15008aa5-392f-4827-87b9-f03e19b4bd70 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.208620] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b973ac-e161-487c-bcd2-e23d3df59a8f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.240540] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5625517c-cd44-4b05-b014-645608144488 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.248158] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d56c537e-05e6-49eb-8028-e81ca52438bd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.262149] env[60164]: DEBUG nova.compute.provider_tree [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 703.272231] env[60164]: DEBUG nova.scheduler.client.report [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 703.293032] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.406s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.293542] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 703.334395] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "c7cb800a-3634-44e4-bb18-fab9d2e86c7e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.334582] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "c7cb800a-3634-44e4-bb18-fab9d2e86c7e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.346844] env[60164]: DEBUG nova.compute.utils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 703.350717] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 703.350717] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 703.371382] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "8fcf260d-2796-4972-b217-95954e309a6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.371619] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "8fcf260d-2796-4972-b217-95954e309a6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.372141] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 703.444737] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 703.469517] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 703.469764] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 703.469917] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 703.471567] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 703.471567] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 703.471567] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 703.471567] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 703.471567] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 703.471843] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 703.471843] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 703.471843] env[60164]: DEBUG nova.virt.hardware [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 703.472203] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd9d209f-c00b-42bd-8431-219d2ce0ea54 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.484852] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1817068-ee3d-4357-b1ab-e349fc88abf2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.565892] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.583271] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Releasing lock "refresh_cache-5ada08c2-ea12-4b16-9384-af545c8e06aa" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 703.583271] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 703.583271] env[60164]: DEBUG nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 703.583271] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 703.636439] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 703.647273] env[60164]: DEBUG nova.network.neutron [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.660200] env[60164]: INFO nova.compute.manager [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] [instance: 5ada08c2-ea12-4b16-9384-af545c8e06aa] Took 0.08 seconds to deallocate network for instance. 
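[editor's note: the nova.virt.hardware DEBUG lines above record how the 1-vCPU m1.nano flavor, with no flavor or image limits, collapses to a single desired topology of sockets=1, cores=1, threads=1. As a rough, hand-written illustration of that enumeration (not nova's actual code; the function name, defaults, and structure below are assumptions):

    # Sketch: enumerate (sockets, cores, threads) triples whose product equals
    # the vCPU count and which stay within the logged limits (65536 each when
    # neither flavor nor image constrains the topology).
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        topologies = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        topologies.append((sockets, cores, threads))
        return topologies

    print(possible_topologies(1))  # [(1, 1, 1)] -- matches "Got 1 possible topologies"

end of editor's note]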
[ 703.707873] env[60164]: DEBUG nova.policy [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4bf93c97450a44e5bf91e32b218736d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1fece12693747c2b6e56f92f305881f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 703.752891] env[60164]: INFO nova.scheduler.client.report [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Deleted allocations for instance 5ada08c2-ea12-4b16-9384-af545c8e06aa [ 703.777070] env[60164]: DEBUG oslo_concurrency.lockutils [None req-73bbd370-d330-49ce-9037-38c4e2ad9cc4 tempest-ServerMetadataTestJSON-855428705 tempest-ServerMetadataTestJSON-855428705-project-member] Lock "5ada08c2-ea12-4b16-9384-af545c8e06aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.361s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.799509] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 703.848804] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.849351] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.850835] env[60164]: INFO nova.compute.claims [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 703.981683] env[60164]: ERROR nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. 
[ 703.981683] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 703.981683] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 703.981683] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 703.981683] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 703.981683] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 703.981683] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 703.981683] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 703.981683] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 703.981683] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 703.981683] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 703.981683] env[60164]: ERROR nova.compute.manager raise self.value [ 703.981683] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 703.981683] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 703.981683] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 703.981683] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 703.982388] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 703.982388] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 703.982388] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. 
[ 703.982388] env[60164]: ERROR nova.compute.manager [ 703.982388] env[60164]: Traceback (most recent call last): [ 703.982388] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 703.982388] env[60164]: listener.cb(fileno) [ 703.982388] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 703.982388] env[60164]: result = function(*args, **kwargs) [ 703.982388] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 703.982388] env[60164]: return func(*args, **kwargs) [ 703.982388] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 703.982388] env[60164]: raise e [ 703.982388] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 703.982388] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 703.982388] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 703.982388] env[60164]: created_port_ids = self._update_ports_for_instance( [ 703.982388] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 703.982388] env[60164]: with excutils.save_and_reraise_exception(): [ 703.982388] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 703.982388] env[60164]: self.force_reraise() [ 703.982388] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 703.982388] env[60164]: raise self.value [ 703.982388] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 703.982388] env[60164]: updated_port = self._update_port( [ 703.982388] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 703.982388] env[60164]: _ensure_no_port_binding_failure(port) [ 703.982388] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 703.982388] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 703.983065] env[60164]: nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. [ 703.983065] env[60164]: Removing descriptor: 17 [ 703.984518] env[60164]: ERROR nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. 
[ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Traceback (most recent call last): [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] yield resources [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self.driver.spawn(context, instance, image_meta, [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] vm_ref = self.build_virtual_machine(instance, [ 703.984518] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] vif_infos = vmwarevif.get_vif_info(self._session, [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] for vif in network_info: [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return self._sync_wrapper(fn, *args, **kwargs) [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self.wait() [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self[:] = self._gt.wait() [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return self._exit_event.wait() [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 703.984884] env[60164]: ERROR nova.compute.manager 
[instance: 7e356a2e-b299-4801-af74-f536a12489fc] result = hub.switch() [ 703.984884] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return self.greenlet.switch() [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] result = function(*args, **kwargs) [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return func(*args, **kwargs) [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] raise e [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] nwinfo = self.network_api.allocate_for_instance( [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] created_port_ids = self._update_ports_for_instance( [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 703.985707] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] with excutils.save_and_reraise_exception(): [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self.force_reraise() [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] raise self.value [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] updated_port = self._update_port( [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 
7e356a2e-b299-4801-af74-f536a12489fc] _ensure_no_port_binding_failure(port) [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] raise exception.PortBindingFailed(port_id=port['id']) [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. [ 703.986055] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] [ 703.986387] env[60164]: INFO nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Terminating instance [ 703.987543] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquiring lock "refresh_cache-7e356a2e-b299-4801-af74-f536a12489fc" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 703.987543] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquired lock "refresh_cache-7e356a2e-b299-4801-af74-f536a12489fc" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 703.987931] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 704.077892] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b70470-26f2-4698-8f9a-1b275f38e178 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.087236] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b9dfeda-a46e-492e-beec-59b928f0f321 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.119904] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 704.129348] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7e25df0-bc3a-49c2-bc2d-18101dee9a00 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.135432] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bb67daf-12f0-4623-ba35-eb4de7fe2640 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.150221] env[60164]: DEBUG nova.compute.provider_tree [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.159942] env[60164]: DEBUG nova.scheduler.client.report [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.178323] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.178323] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 704.218447] env[60164]: DEBUG nova.compute.utils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 704.219881] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Allocating IP information in the background. 
{{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 704.219881] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 704.231601] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 704.297836] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "c9c2d371-978e-4037-ba78-9b44f40765bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.297836] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "c9c2d371-978e-4037-ba78-9b44f40765bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.307678] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 704.332466] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 704.332466] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 704.332610] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 704.333278] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 704.333278] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 704.333278] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 704.333278] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 704.333463] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 704.333562] env[60164]: DEBUG nova.virt.hardware [None 
req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 704.333971] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 704.334683] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 704.335460] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58f27b81-9842-42ca-9c85-ecbcdd78a750 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.343808] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ecffcf1-905f-41ee-8a55-8e3f2f68e654 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.737229] env[60164]: DEBUG nova.policy [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4cc2b0ed84534852a16f9fdd4a8977f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a98b1fd8031545e381db0682e508fc18', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 704.857683] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.867898] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Releasing lock "refresh_cache-7e356a2e-b299-4801-af74-f536a12489fc" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.868487] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 704.868528] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 704.869038] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-74f8afe3-e518-4f5d-89b8-f0a36bb60497 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.883303] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fcfddc1-86e9-4713-9efb-62aac2e81891 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.913192] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7e356a2e-b299-4801-af74-f536a12489fc could not be found. [ 704.913192] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 704.913192] env[60164]: INFO nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Took 0.04 seconds to destroy the instance on the hypervisor. [ 704.913386] env[60164]: DEBUG oslo.service.loopingcall [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 704.913506] env[60164]: DEBUG nova.compute.manager [-] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 704.913599] env[60164]: DEBUG nova.network.neutron [-] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 704.978313] env[60164]: DEBUG nova.network.neutron [-] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 704.988738] env[60164]: DEBUG nova.network.neutron [-] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.997879] env[60164]: INFO nova.compute.manager [-] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Took 0.08 seconds to deallocate network for instance. 
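[editor's note: the tracebacks above all end in _ensure_no_port_binding_failure() raising nova.exception.PortBindingFailed once neutron reports a failed binding for the port, which is what triggers the teardown and reschedule seen for instance 7e356a2e. A loose, self-contained sketch of that kind of check (not the code in /opt/stack/nova; the 'binding_failed' vif_type marker and the class/helper names here are assumptions):

    # Sketch: turn a failed Neutron port binding into a hard error so the
    # build is aborted and the instance is rescheduled.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")

    def ensure_no_port_binding_failure(port):
        # 'binding:vif_type' == 'binding_failed' is the usual marker set when
        # no mechanism driver could bind the port (assumption, not taken from
        # this log).
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    ensure_no_port_binding_failure(
        {'id': 'e641e291-cbd8-404d-ab1d-4dee8d7969cd',
         'binding:vif_type': 'binding_failed'})  # raises PortBindingFailed

end of editor's note]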
[ 704.999870] env[60164]: DEBUG nova.compute.claims [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 705.000057] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.000271] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.352427] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-589e216d-7320-4ad8-a093-7b8d0cf678df {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.361012] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5f9128d-e291-4350-820d-7c89313e9016 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.401709] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7fc4d1c-25f7-46bd-955b-2bf51305dbde {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.409398] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e746ca0f-c664-4666-9924-8a7b7d9e1e5b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.423245] env[60164]: DEBUG nova.compute.provider_tree [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 705.434211] env[60164]: DEBUG nova.scheduler.client.report [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 705.463331] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 
tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.463s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.463843] env[60164]: ERROR nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Traceback (most recent call last): [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self.driver.spawn(context, instance, image_meta, [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] vm_ref = self.build_virtual_machine(instance, [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] vif_infos = vmwarevif.get_vif_info(self._session, [ 705.463843] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] for vif in network_info: [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return self._sync_wrapper(fn, *args, **kwargs) [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self.wait() [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self[:] = self._gt.wait() [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 
7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return self._exit_event.wait() [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] result = hub.switch() [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return self.greenlet.switch() [ 705.464225] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] result = function(*args, **kwargs) [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] return func(*args, **kwargs) [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] raise e [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] nwinfo = self.network_api.allocate_for_instance( [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] created_port_ids = self._update_ports_for_instance( [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] with excutils.save_and_reraise_exception(): [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.464690] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] self.force_reraise() [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] raise self.value [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 
7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] updated_port = self._update_port( [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] _ensure_no_port_binding_failure(port) [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] raise exception.PortBindingFailed(port_id=port['id']) [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] nova.exception.PortBindingFailed: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. [ 705.465092] env[60164]: ERROR nova.compute.manager [instance: 7e356a2e-b299-4801-af74-f536a12489fc] [ 705.465519] env[60164]: DEBUG nova.compute.utils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 705.471408] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Build of instance 7e356a2e-b299-4801-af74-f536a12489fc was re-scheduled: Binding failed for port e641e291-cbd8-404d-ab1d-4dee8d7969cd, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 705.471408] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 705.471408] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquiring lock "refresh_cache-7e356a2e-b299-4801-af74-f536a12489fc" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 705.471408] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Acquired lock "refresh_cache-7e356a2e-b299-4801-af74-f536a12489fc" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.471624] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 705.505424] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "156cf534-81ca-4cc6-9b0d-2d245016c53c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.505647] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "156cf534-81ca-4cc6-9b0d-2d245016c53c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.534497] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Successfully created port: 9c7bd52d-273d-4d17-8e75-b836df862857 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 705.596590] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 705.765490] env[60164]: ERROR nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. [ 705.765490] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 705.765490] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 705.765490] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 705.765490] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 705.765490] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 705.765490] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 705.765490] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 705.765490] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.765490] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 705.765490] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.765490] env[60164]: ERROR nova.compute.manager raise self.value [ 705.765490] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 705.765490] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 705.765490] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.765490] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 705.766132] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.766132] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 705.766132] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. 
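The traceback above bottoms out in _ensure_no_port_binding_failure raising PortBindingFailed. A minimal sketch of what that kind of check amounts to, assuming the standard Neutron 'binding:vif_type' port attribute and reusing the port id from the log; this is an illustration, not nova's exact code:

```python
# Hedged sketch of the kind of check that raises PortBindingFailed; assumes a
# port dict shaped like a Neutron API response, not nova's exact code.
VIF_TYPE_BINDING_FAILED = 'binding_failed'


class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__("Binding failed for port %s, please check neutron "
                         "logs for more information." % port_id)


def ensure_no_port_binding_failure(port):
    """Raise if Neutron reported it could not bind the port on this host."""
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])


# The port id below is the one from the traceback above.
try:
    ensure_no_port_binding_failure(
        {'id': '53857d03-68f5-47d9-b7d9-b532a6d42fcf',
         'binding:vif_type': 'binding_failed'})
except PortBindingFailed as exc:
    print(exc)
```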
[ 705.766132] env[60164]: ERROR nova.compute.manager [ 705.766132] env[60164]: Traceback (most recent call last): [ 705.766132] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 705.766132] env[60164]: listener.cb(fileno) [ 705.766132] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 705.766132] env[60164]: result = function(*args, **kwargs) [ 705.766132] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 705.766132] env[60164]: return func(*args, **kwargs) [ 705.766132] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 705.766132] env[60164]: raise e [ 705.766132] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 705.766132] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 705.766132] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 705.766132] env[60164]: created_port_ids = self._update_ports_for_instance( [ 705.766132] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 705.766132] env[60164]: with excutils.save_and_reraise_exception(): [ 705.766132] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.766132] env[60164]: self.force_reraise() [ 705.766132] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.766132] env[60164]: raise self.value [ 705.766132] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 705.766132] env[60164]: updated_port = self._update_port( [ 705.766132] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.766132] env[60164]: _ensure_no_port_binding_failure(port) [ 705.766132] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.766132] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 705.766869] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. [ 705.766869] env[60164]: Removing descriptor: 19 [ 705.766869] env[60164]: ERROR nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. 
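The bare traceback above (followed by "Removing descriptor") is eventlet's hub logging the exception that escaped the spawned _allocate_network_async greenthread; the "Instance failed to spawn" traceback that continues below is the build path re-raising the same error when it waits on that greenthread. A small sketch of the propagation pattern, assuming stock eventlet; the allocator is a stand-in:

```python
# Hedged sketch: an exception raised inside an eventlet greenthread is stored
# on its exit event and re-raised in whoever calls wait(), which is why the
# same PortBindingFailed shows up once from the hub and once from the build
# path. Stock eventlet only; the allocator below is illustrative.
import eventlet


class PortBindingFailed(Exception):
    pass


def allocate_network_async(port_id):
    # Stand-in for ComputeManager._allocate_network_async.
    raise PortBindingFailed("Binding failed for port %s" % port_id)


gt = eventlet.spawn(allocate_network_async,
                    '53857d03-68f5-47d9-b7d9-b532a6d42fcf')
try:
    gt.wait()   # re-raises the greenthread's exception in the caller
except PortBindingFailed as exc:
    print('caught in waiter:', exc)
```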
[ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] Traceback (most recent call last): [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] yield resources [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self.driver.spawn(context, instance, image_meta, [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 705.766869] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] vm_ref = self.build_virtual_machine(instance, [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] vif_infos = vmwarevif.get_vif_info(self._session, [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] for vif in network_info: [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return self._sync_wrapper(fn, *args, **kwargs) [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self.wait() [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self[:] = self._gt.wait() [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return self._exit_event.wait() [ 705.767168] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.767168] env[60164]: ERROR nova.compute.manager 
[instance: e52a3adf-4654-43cd-8613-749277053ea8] result = hub.switch() [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return self.greenlet.switch() [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] result = function(*args, **kwargs) [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return func(*args, **kwargs) [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] raise e [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] nwinfo = self.network_api.allocate_for_instance( [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] created_port_ids = self._update_ports_for_instance( [ 705.767517] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] with excutils.save_and_reraise_exception(): [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self.force_reraise() [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] raise self.value [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] updated_port = self._update_port( [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: 
e52a3adf-4654-43cd-8613-749277053ea8] _ensure_no_port_binding_failure(port) [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] raise exception.PortBindingFailed(port_id=port['id']) [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. [ 705.767864] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] [ 705.768193] env[60164]: INFO nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Terminating instance [ 705.768193] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "refresh_cache-e52a3adf-4654-43cd-8613-749277053ea8" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 705.768193] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquired lock "refresh_cache-e52a3adf-4654-43cd-8613-749277053ea8" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.768193] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 706.013626] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 706.416297] env[60164]: ERROR nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. 
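The model.py frames in that spawn traceback (__iter__, _sync_wrapper, wait) show that network_info is a lazy wrapper around the async allocation: the PortBindingFailed only surfaces once the VMware driver iterates the VIFs in get_vif_info. A simplified stand-in for such a wrapper, assuming stock eventlet; it is not nova's NetworkInfoAsyncWrapper:

```python
# Simplified stand-in for a lazy async result wrapper (not nova's
# NetworkInfoAsyncWrapper): allocation runs in a greenthread, and the first
# iteration over the VIFs blocks on it, re-raising any allocation error there.
import eventlet


class AsyncListWrapper:
    def __init__(self, func, *args):
        self._gt = eventlet.spawn(func, *args)
        self._items = None

    def _wait(self):
        if self._items is None:
            self._items = self._gt.wait()   # allocation errors surface here
        return self._items

    def __iter__(self):
        return iter(self._wait())


def allocate(port_ids):
    # Stand-in for network allocation; returns one VIF dict per port.
    return [{'port_id': p} for p in port_ids]


vifs = AsyncListWrapper(allocate, ['9c7bd52d-273d-4d17-8e75-b836df862857'])
for vif in vifs:    # a get_vif_info-style loop blocks here on first use
    print(vif)
```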
[ 706.416297] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 706.416297] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 706.416297] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 706.416297] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 706.416297] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 706.416297] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 706.416297] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 706.416297] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.416297] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 706.416297] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.416297] env[60164]: ERROR nova.compute.manager raise self.value [ 706.416297] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 706.416297] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 706.416297] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 706.416297] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 706.416992] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.416992] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 706.416992] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. 
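Every one of these tracebacks also passes through oslo_utils.excutils.save_and_reraise_exception (the force_reraise and "raise self.value" frames), which lets cleanup run while preserving the original exception. A short usage sketch with stub helpers standing in for the real port operations:

```python
# Hedged sketch of the save_and_reraise_exception pattern seen in the frames
# above: cleanup runs inside the context manager, then the original exception
# is re-raised on exit. The port helpers are stubs, not nova code.
from oslo_utils import excutils


class PortBindingFailed(Exception):
    pass


def bind_port(port_id):
    raise PortBindingFailed("Binding failed for port %s" % port_id)


def release_port(port_id):
    print('released', port_id)


def update_ports(port_ids):
    created = []
    try:
        for port_id in port_ids:
            bind_port(port_id)
            created.append(port_id)
    except Exception:
        with excutils.save_and_reraise_exception():
            # Clean up anything already created; the original
            # PortBindingFailed is re-raised when this block exits.
            for port_id in created:
                release_port(port_id)


try:
    update_ports(['05e585fb-54d8-4a6a-b92f-a1b020c55e21'])
except PortBindingFailed as exc:
    print('re-raised:', exc)
```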
[ 706.416992] env[60164]: ERROR nova.compute.manager [ 706.416992] env[60164]: Traceback (most recent call last): [ 706.416992] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 706.416992] env[60164]: listener.cb(fileno) [ 706.416992] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 706.416992] env[60164]: result = function(*args, **kwargs) [ 706.416992] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 706.416992] env[60164]: return func(*args, **kwargs) [ 706.416992] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 706.416992] env[60164]: raise e [ 706.416992] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 706.416992] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 706.416992] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 706.416992] env[60164]: created_port_ids = self._update_ports_for_instance( [ 706.416992] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 706.416992] env[60164]: with excutils.save_and_reraise_exception(): [ 706.416992] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.416992] env[60164]: self.force_reraise() [ 706.416992] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.416992] env[60164]: raise self.value [ 706.416992] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 706.416992] env[60164]: updated_port = self._update_port( [ 706.416992] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 706.416992] env[60164]: _ensure_no_port_binding_failure(port) [ 706.416992] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.416992] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 706.417861] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. [ 706.417861] env[60164]: Removing descriptor: 12 [ 706.417861] env[60164]: ERROR nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. 
[ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Traceback (most recent call last): [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] yield resources [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self.driver.spawn(context, instance, image_meta, [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 706.417861] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] vm_ref = self.build_virtual_machine(instance, [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] vif_infos = vmwarevif.get_vif_info(self._session, [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] for vif in network_info: [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return self._sync_wrapper(fn, *args, **kwargs) [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self.wait() [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self[:] = self._gt.wait() [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return self._exit_event.wait() [ 706.418230] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 706.418230] env[60164]: ERROR nova.compute.manager 
[instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] result = hub.switch() [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return self.greenlet.switch() [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] result = function(*args, **kwargs) [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return func(*args, **kwargs) [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] raise e [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] nwinfo = self.network_api.allocate_for_instance( [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] created_port_ids = self._update_ports_for_instance( [ 706.418588] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] with excutils.save_and_reraise_exception(): [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self.force_reraise() [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] raise self.value [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] updated_port = self._update_port( [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: 
ef7b219e-437d-4b15-b559-ca5e2405efb2] _ensure_no_port_binding_failure(port) [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] raise exception.PortBindingFailed(port_id=port['id']) [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. [ 706.418937] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] [ 706.421466] env[60164]: INFO nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Terminating instance [ 706.422418] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquiring lock "refresh_cache-ef7b219e-437d-4b15-b559-ca5e2405efb2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 706.422797] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquired lock "refresh_cache-ef7b219e-437d-4b15-b559-ca5e2405efb2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 706.423185] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 706.440134] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.447386] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.454983] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Releasing lock "refresh_cache-e52a3adf-4654-43cd-8613-749277053ea8" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 706.454983] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 
e52a3adf-4654-43cd-8613-749277053ea8] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 706.454983] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 706.454983] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2e3cce6a-f2f5-43aa-839a-e72583466450 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.461125] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Releasing lock "refresh_cache-7e356a2e-b299-4801-af74-f536a12489fc" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 706.461125] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 706.461125] env[60164]: DEBUG nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 706.461125] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 706.466957] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a749959-cad9-4915-936f-68eba85778a9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.480729] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 706.499429] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e52a3adf-4654-43cd-8613-749277053ea8 could not be found. 
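The Acquiring/Acquired/Releasing lines around "refresh_cache-<instance uuid>" come from oslo_concurrency.lockutils. A minimal sketch of that per-instance locking pattern, with an illustrative body rather than nova's actual cache-refresh code:

```python
# Hedged sketch: the Acquiring/Acquired/Releasing lock lines in this log are
# emitted by oslo_concurrency.lockutils; a per-instance cache refresh guarded
# the same way might look roughly like this (the body is illustrative).
from oslo_concurrency import lockutils


def refresh_network_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        # Rebuild the instance network info cache while holding the lock so
        # concurrent build and teardown paths don't race on the same instance.
        return []   # stand-in for the rebuilt network_info


print(refresh_network_cache('e52a3adf-4654-43cd-8613-749277053ea8'))
```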
[ 706.500354] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 706.500354] env[60164]: INFO nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Took 0.05 seconds to destroy the instance on the hypervisor. [ 706.500354] env[60164]: DEBUG oslo.service.loopingcall [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 706.500729] env[60164]: DEBUG nova.compute.manager [-] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 706.500914] env[60164]: DEBUG nova.network.neutron [-] [instance: e52a3adf-4654-43cd-8613-749277053ea8] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 706.556956] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 706.559734] env[60164]: DEBUG nova.network.neutron [-] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 706.567470] env[60164]: DEBUG nova.network.neutron [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.577146] env[60164]: DEBUG nova.network.neutron [-] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.589577] env[60164]: INFO nova.compute.manager [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] [instance: 7e356a2e-b299-4801-af74-f536a12489fc] Took 0.13 seconds to deallocate network for instance. [ 706.596092] env[60164]: INFO nova.compute.manager [-] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Took 0.09 seconds to deallocate network for instance. 
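The oslo.service.loopingcall line above shows network deallocation being waited on through a retrying helper. One hedged way to express retry-until-done with oslo_service.loopingcall is sketched below; the retry policy and function are illustrative, not necessarily the mechanism nova uses here:

```python
# Hedged sketch: one way to express "retry until it succeeds" with
# oslo_service.loopingcall, similar in spirit to the
# _deallocate_network_with_retries wait above; not nova's exact mechanism.
from oslo_service import loopingcall

attempts = {'n': 0}


def _deallocate_network_once():
    attempts['n'] += 1
    if attempts['n'] < 3:   # pretend the first two calls fail
        return              # returning normally means "loop again"
    raise loopingcall.LoopingCallDone(retvalue='deallocated')


timer = loopingcall.FixedIntervalLoopingCall(_deallocate_network_once)
result = timer.start(interval=0.1).wait()   # blocks until LoopingCallDone
print(result, 'after', attempts['n'], 'attempts')
```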
[ 706.598040] env[60164]: DEBUG nova.compute.claims [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 706.598217] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.598424] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.720527] env[60164]: INFO nova.scheduler.client.report [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Deleted allocations for instance 7e356a2e-b299-4801-af74-f536a12489fc [ 706.757968] env[60164]: DEBUG oslo_concurrency.lockutils [None req-24b40525-bc39-4c67-8fb1-5b1623397163 tempest-ServersV294TestFqdnHostnames-1586253388 tempest-ServersV294TestFqdnHostnames-1586253388-project-member] Lock "7e356a2e-b299-4801-af74-f536a12489fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.244s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.785026] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 706.795119] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Successfully created port: d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 706.849829] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.907026] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cba748d6-202c-4763-8213-a138fbd32631 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.912811] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.917449] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de5dc4e5-fdab-4448-96be-de248283fcb3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.955687] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-205d9e18-c6c3-4859-97c6-3569be220c9e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.957777] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Releasing lock "refresh_cache-ef7b219e-437d-4b15-b559-ca5e2405efb2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 706.958166] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 706.958358] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 706.958879] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8ed0a700-78ce-4200-9f94-922dd9fc783d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.966783] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e4aab7-ba6d-4d87-80d0-9def77c01a40 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.974057] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca760eae-4ce7-43e7-9122-9fb316e82523 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.996754] env[60164]: DEBUG nova.compute.provider_tree [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.005296] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ef7b219e-437d-4b15-b559-ca5e2405efb2 could not be found. [ 707.005296] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 707.005296] env[60164]: INFO nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Took 0.05 seconds to destroy the instance on the hypervisor. [ 707.005296] env[60164]: DEBUG oslo.service.loopingcall [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 707.005607] env[60164]: DEBUG nova.compute.manager [-] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 707.005703] env[60164]: DEBUG nova.network.neutron [-] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 707.009521] env[60164]: DEBUG nova.scheduler.client.report [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.026961] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.428s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.027977] env[60164]: ERROR nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. 
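The inventory dict reported above is what the resource tracker sends to Placement; usable capacity per resource class is effectively (total - reserved) * allocation_ratio. A small worked example over the exact figures from the log:

```python
# Worked example over the inventory reported above: Placement's usable
# capacity per resource class is (total - reserved) * allocation_ratio.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print('%s: %g' % (rc, capacity))
# -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
```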
[ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] Traceback (most recent call last): [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self.driver.spawn(context, instance, image_meta, [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] vm_ref = self.build_virtual_machine(instance, [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] vif_infos = vmwarevif.get_vif_info(self._session, [ 707.027977] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] for vif in network_info: [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return self._sync_wrapper(fn, *args, **kwargs) [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self.wait() [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self[:] = self._gt.wait() [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return self._exit_event.wait() [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] result = hub.switch() [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 707.028349] env[60164]: ERROR 
nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return self.greenlet.switch() [ 707.028349] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] result = function(*args, **kwargs) [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] return func(*args, **kwargs) [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] raise e [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] nwinfo = self.network_api.allocate_for_instance( [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] created_port_ids = self._update_ports_for_instance( [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] with excutils.save_and_reraise_exception(): [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.028690] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] self.force_reraise() [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] raise self.value [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] updated_port = self._update_port( [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] _ensure_no_port_binding_failure(port) [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.028998] env[60164]: ERROR 
nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] raise exception.PortBindingFailed(port_id=port['id']) [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] nova.exception.PortBindingFailed: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. [ 707.028998] env[60164]: ERROR nova.compute.manager [instance: e52a3adf-4654-43cd-8613-749277053ea8] [ 707.029258] env[60164]: DEBUG nova.compute.utils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 707.030902] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.181s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.032987] env[60164]: INFO nova.compute.claims [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 707.035772] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Build of instance e52a3adf-4654-43cd-8613-749277053ea8 was re-scheduled: Binding failed for port 53857d03-68f5-47d9-b7d9-b532a6d42fcf, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 707.036367] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 707.036367] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "refresh_cache-e52a3adf-4654-43cd-8613-749277053ea8" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.036778] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquired lock "refresh_cache-e52a3adf-4654-43cd-8613-749277053ea8" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.036778] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 707.199034] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.199034] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.232130] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 707.237870] env[60164]: DEBUG nova.network.neutron [-] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 707.245971] env[60164]: DEBUG nova.network.neutron [-] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.256345] env[60164]: INFO nova.compute.manager [-] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Took 0.25 seconds to deallocate network for instance. [ 707.258312] env[60164]: DEBUG nova.compute.claims [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 707.258481] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.337966] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1d776bb-993d-4316-9240-9c0f9c296261 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.346096] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b64742a4-38fe-4ca9-bb24-29660a605792 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.379024] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f0d8d5a-6251-47d0-b7de-e26f4707bcd5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.385479] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44ddf5e9-3e57-4f58-919e-55df502c0e67 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.399079] env[60164]: DEBUG nova.compute.provider_tree [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.411707] env[60164]: DEBUG nova.scheduler.client.report [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.433496] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 
tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.402s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.433979] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 707.436308] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.178s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.471394] env[60164]: DEBUG nova.compute.utils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 707.473042] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 707.473211] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 707.483579] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 707.593886] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 707.598922] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.618108] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Releasing lock "refresh_cache-e52a3adf-4654-43cd-8613-749277053ea8" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 707.618373] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 707.618588] env[60164]: DEBUG nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 707.619099] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 707.624073] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 707.624215] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 707.625404] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d 
tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 707.625404] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 707.625404] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 707.625404] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 707.625582] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 707.626095] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 707.626095] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 707.626095] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 707.626235] env[60164]: DEBUG nova.virt.hardware [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 707.632024] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bd8299b-738b-47a3-9787-15452baf1f91 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.640208] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9d99b80-31c3-48aa-afad-447ef1216995 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.668093] env[60164]: DEBUG nova.policy [None 
req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4cc2b0ed84534852a16f9fdd4a8977f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a98b1fd8031545e381db0682e508fc18', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 707.676673] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 707.691662] env[60164]: DEBUG nova.network.neutron [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.704691] env[60164]: INFO nova.compute.manager [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: e52a3adf-4654-43cd-8613-749277053ea8] Took 0.09 seconds to deallocate network for instance. 
[ 707.732170] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-326b876d-9c89-4203-93cd-384f7aa3b7d3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.739541] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dbc1e83-aadd-4db1-bb87-5a4f6ceddb0f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.777137] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89d919d5-311e-42b3-ac1d-4561cbf1a124 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.790439] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7623af2-8704-4fea-bb0c-d79a498522f3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.807508] env[60164]: DEBUG nova.compute.provider_tree [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.820948] env[60164]: DEBUG nova.scheduler.client.report [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.835718] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.399s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.836373] env[60164]: ERROR nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. 
[ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Traceback (most recent call last): [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self.driver.spawn(context, instance, image_meta, [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] vm_ref = self.build_virtual_machine(instance, [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] vif_infos = vmwarevif.get_vif_info(self._session, [ 707.836373] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] for vif in network_info: [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return self._sync_wrapper(fn, *args, **kwargs) [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self.wait() [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self[:] = self._gt.wait() [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return self._exit_event.wait() [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] result = hub.switch() [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 707.837415] env[60164]: ERROR 
nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return self.greenlet.switch() [ 707.837415] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] result = function(*args, **kwargs) [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] return func(*args, **kwargs) [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] raise e [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] nwinfo = self.network_api.allocate_for_instance( [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] created_port_ids = self._update_ports_for_instance( [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] with excutils.save_and_reraise_exception(): [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.837759] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] self.force_reraise() [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] raise self.value [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] updated_port = self._update_port( [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] _ensure_no_port_binding_failure(port) [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.838103] env[60164]: ERROR 
nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] raise exception.PortBindingFailed(port_id=port['id']) [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] nova.exception.PortBindingFailed: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. [ 707.838103] env[60164]: ERROR nova.compute.manager [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] [ 707.838103] env[60164]: DEBUG nova.compute.utils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 707.838683] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Build of instance ef7b219e-437d-4b15-b559-ca5e2405efb2 was re-scheduled: Binding failed for port 05e585fb-54d8-4a6a-b92f-a1b020c55e21, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 707.839121] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 707.839345] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquiring lock "refresh_cache-ef7b219e-437d-4b15-b559-ca5e2405efb2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.839631] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Acquired lock "refresh_cache-ef7b219e-437d-4b15-b559-ca5e2405efb2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.839724] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 707.844423] env[60164]: INFO nova.scheduler.client.report [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Deleted allocations for instance e52a3adf-4654-43cd-8613-749277053ea8 [ 707.872277] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e294a58d-5428-4f1a-816a-e9e5a9a27f37 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "e52a3adf-4654-43cd-8613-749277053ea8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.329s 
{{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.888244] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 707.895404] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 707.943459] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.943706] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.945187] env[60164]: INFO nova.compute.claims [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 708.204078] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9368291e-7f7d-4c61-97df-be4cf3b96ab4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.212478] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8cbdc8c-291f-4314-80e9-af9efa29b9c4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.246334] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d381237-3e57-42e9-a99e-ccef50c6b950 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.254156] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54ab7b5a-88a9-4835-9870-b34d77e37bca {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.269185] env[60164]: DEBUG nova.compute.provider_tree [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 708.277863] env[60164]: DEBUG nova.scheduler.client.report [None req-c6361768-afea-4756-bbb0-dabacec97540 
tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 708.294318] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.295024] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 708.337861] env[60164]: DEBUG nova.compute.utils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 708.339338] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 708.339338] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 708.353468] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 708.434377] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 708.462421] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 708.462999] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 708.462999] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 708.462999] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 708.463428] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 708.463428] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 708.463533] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 708.463618] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 
708.463781] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 708.463938] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 708.464201] env[60164]: DEBUG nova.virt.hardware [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 708.465274] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2392a30e-795d-4e73-8de9-ada7675261f1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.473870] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28fa0033-251f-43e3-8355-6393327f68f7 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.500688] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.512761] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Releasing lock "refresh_cache-ef7b219e-437d-4b15-b559-ca5e2405efb2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 708.512995] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 708.513193] env[60164]: DEBUG nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 708.513361] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 708.575189] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 708.584879] env[60164]: DEBUG nova.network.neutron [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.597683] env[60164]: INFO nova.compute.manager [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] [instance: ef7b219e-437d-4b15-b559-ca5e2405efb2] Took 0.08 seconds to deallocate network for instance. [ 708.692533] env[60164]: INFO nova.scheduler.client.report [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Deleted allocations for instance ef7b219e-437d-4b15-b559-ca5e2405efb2 [ 708.718383] env[60164]: DEBUG oslo_concurrency.lockutils [None req-85d39ece-81cb-4714-a505-602e6aa6d34a tempest-ServerRescueTestJSON-406518722 tempest-ServerRescueTestJSON-406518722-project-member] Lock "ef7b219e-437d-4b15-b559-ca5e2405efb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.938s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.738441] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 708.819409] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.819688] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.821243] env[60164]: INFO nova.compute.claims [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 708.844704] env[60164]: DEBUG nova.policy [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37788e4056b84ab0b461767fad9e3955', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9af4fbea46444d81b8ed5dd844ce87d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 708.951186] env[60164]: ERROR nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. 
[ 708.951186] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 708.951186] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 708.951186] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 708.951186] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 708.951186] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 708.951186] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 708.951186] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 708.951186] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 708.951186] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 708.951186] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 708.951186] env[60164]: ERROR nova.compute.manager raise self.value [ 708.951186] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 708.951186] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 708.951186] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 708.951186] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 708.952295] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 708.952295] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 708.952295] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. 
[ 708.952295] env[60164]: ERROR nova.compute.manager [ 708.952295] env[60164]: Traceback (most recent call last): [ 708.952295] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 708.952295] env[60164]: listener.cb(fileno) [ 708.952295] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 708.952295] env[60164]: result = function(*args, **kwargs) [ 708.952295] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 708.952295] env[60164]: return func(*args, **kwargs) [ 708.952295] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 708.952295] env[60164]: raise e [ 708.952295] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 708.952295] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 708.952295] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 708.952295] env[60164]: created_port_ids = self._update_ports_for_instance( [ 708.952295] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 708.952295] env[60164]: with excutils.save_and_reraise_exception(): [ 708.952295] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 708.952295] env[60164]: self.force_reraise() [ 708.952295] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 708.952295] env[60164]: raise self.value [ 708.952295] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 708.952295] env[60164]: updated_port = self._update_port( [ 708.952295] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 708.952295] env[60164]: _ensure_no_port_binding_failure(port) [ 708.952295] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 708.952295] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 708.953239] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. [ 708.953239] env[60164]: Removing descriptor: 20 [ 708.953239] env[60164]: ERROR nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. 
[ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Traceback (most recent call last): [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] yield resources [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self.driver.spawn(context, instance, image_meta, [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 708.953239] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] vm_ref = self.build_virtual_machine(instance, [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] vif_infos = vmwarevif.get_vif_info(self._session, [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] for vif in network_info: [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return self._sync_wrapper(fn, *args, **kwargs) [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self.wait() [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self[:] = self._gt.wait() [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return self._exit_event.wait() [ 708.953522] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 708.953522] env[60164]: ERROR nova.compute.manager 
[instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] result = hub.switch() [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return self.greenlet.switch() [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] result = function(*args, **kwargs) [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return func(*args, **kwargs) [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] raise e [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] nwinfo = self.network_api.allocate_for_instance( [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] created_port_ids = self._update_ports_for_instance( [ 708.953860] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] with excutils.save_and_reraise_exception(): [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self.force_reraise() [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] raise self.value [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] updated_port = self._update_port( [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 
9bee98ef-48b4-47e6-8afb-e535e58e50cb] _ensure_no_port_binding_failure(port) [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] raise exception.PortBindingFailed(port_id=port['id']) [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. [ 708.954193] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] [ 708.954778] env[60164]: INFO nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Terminating instance [ 708.958157] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquiring lock "refresh_cache-9bee98ef-48b4-47e6-8afb-e535e58e50cb" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 708.958323] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquired lock "refresh_cache-9bee98ef-48b4-47e6-8afb-e535e58e50cb" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 708.958517] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 709.045886] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Successfully created port: 0e04edf6-7d03-4368-98ac-203be2fde2ed {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 709.051141] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 709.097981] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1872b341-43cf-4041-bcbd-fde39fac12a3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.108015] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03f8c381-22ab-4f20-a4ce-2ba5e729be2f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.143231] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7361636-eb47-4c43-a52b-479915ae639c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.151027] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5a1c5c8-3031-4ca9-a282-fc96e1f31e5e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.166930] env[60164]: DEBUG nova.compute.provider_tree [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 709.178012] env[60164]: DEBUG nova.scheduler.client.report [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 709.197278] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.197785] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 709.247377] env[60164]: DEBUG nova.compute.utils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 709.248657] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 709.248781] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 709.259094] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 709.329045] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 709.355144] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 709.355439] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 709.355695] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 709.355815] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 709.355976] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 709.356134] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 709.356447] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 709.356537] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:466}} [ 709.356628] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 709.356849] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 709.356976] env[60164]: DEBUG nova.virt.hardware [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 709.358250] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e17a6f7d-b4b2-4bb7-ad65-6812f61a8b9b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.368140] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0f3a0a4-2f7b-4934-85f8-9c3834f1312d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.401105] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.413031] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Releasing lock "refresh_cache-9bee98ef-48b4-47e6-8afb-e535e58e50cb" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.413031] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 709.413196] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 709.413693] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-16c9cc24-c837-4673-9b67-1513358f24b4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.422656] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-296768db-6b3a-4ed4-8b3a-805fd4a6f4a1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.448941] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9bee98ef-48b4-47e6-8afb-e535e58e50cb could not be found. [ 709.449205] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 709.449354] env[60164]: INFO nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 709.449615] env[60164]: DEBUG oslo.service.loopingcall [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 709.449843] env[60164]: DEBUG nova.compute.manager [-] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 709.449939] env[60164]: DEBUG nova.network.neutron [-] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 709.459545] env[60164]: DEBUG nova.policy [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '38afd0b6a9a24556bd374e62b3363f3d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15c6dd50db8b44d190cdaaf8e69222da', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 709.886307] env[60164]: DEBUG nova.network.neutron [-] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 710.186106] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.187661] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.482654] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "43fbb2e2-b827-4fc0-aff4-886a26f4550e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.483333] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "43fbb2e2-b827-4fc0-aff4-886a26f4550e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.597567] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 
tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Successfully created port: 6e276e45-fe27-414d-ba81-a3de27e5773a {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.229097] env[60164]: DEBUG nova.network.neutron [-] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.243021] env[60164]: INFO nova.compute.manager [-] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Took 1.79 seconds to deallocate network for instance. [ 711.244101] env[60164]: DEBUG nova.compute.claims [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 711.244267] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.244471] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.316421] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Successfully created port: 22a86f22-b09e-42d4-94fe-94f6c03a4a0b {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.496869] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76029567-65e5-4d9b-b841-cb10a721f4a8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.505006] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-748877f4-01ee-44bb-af31-d6abd208a8fa {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.537259] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caadfd75-b4dd-4362-bacb-81a6e26f4130 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.545053] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e77e02a4-8aa9-4282-b388-a3854b33da50 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.558929] env[60164]: DEBUG nova.compute.provider_tree [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Inventory has not changed in ProviderTree for provider: 
ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 711.574811] env[60164]: DEBUG nova.scheduler.client.report [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 711.594430] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.595056] env[60164]: ERROR nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Traceback (most recent call last): [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self.driver.spawn(context, instance, image_meta, [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] vm_ref = self.build_virtual_machine(instance, [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] vif_infos = vmwarevif.get_vif_info(self._session, [ 711.595056] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] for vif in network_info: [ 711.595412] env[60164]: ERROR nova.compute.manager 
[instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return self._sync_wrapper(fn, *args, **kwargs) [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self.wait() [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self[:] = self._gt.wait() [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return self._exit_event.wait() [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] result = hub.switch() [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return self.greenlet.switch() [ 711.595412] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] result = function(*args, **kwargs) [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] return func(*args, **kwargs) [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] raise e [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] nwinfo = self.network_api.allocate_for_instance( [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] created_port_ids = self._update_ports_for_instance( [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File 
"/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] with excutils.save_and_reraise_exception(): [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 711.595818] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] self.force_reraise() [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] raise self.value [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] updated_port = self._update_port( [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] _ensure_no_port_binding_failure(port) [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] raise exception.PortBindingFailed(port_id=port['id']) [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] nova.exception.PortBindingFailed: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. [ 711.596182] env[60164]: ERROR nova.compute.manager [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] [ 711.596182] env[60164]: DEBUG nova.compute.utils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 711.597172] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Build of instance 9bee98ef-48b4-47e6-8afb-e535e58e50cb was re-scheduled: Binding failed for port 00b78937-7d5e-4965-91a5-30f8f3c29b85, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 711.597629] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 711.597862] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquiring lock "refresh_cache-9bee98ef-48b4-47e6-8afb-e535e58e50cb" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.598014] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Acquired lock "refresh_cache-9bee98ef-48b4-47e6-8afb-e535e58e50cb" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.598181] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 711.677307] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 711.996695] env[60164]: ERROR nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. 
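Annotation: the repeated "Inventory has not changed for provider ..." entries above carry the provider's inventory dict. As a rough rule of thumb (this mirrors how placement-style capacity is usually derived, not code copied from nova), schedulable capacity per resource class is (total - reserved) * allocation_ratio, which for the values logged here gives 192 VCPU, 196078 MB of RAM and 400 GB of disk.

    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for resource_class, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(resource_class, capacity)
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0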
[ 711.996695] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 711.996695] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 711.996695] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 711.996695] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 711.996695] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 711.996695] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 711.996695] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 711.996695] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 711.996695] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 711.996695] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 711.996695] env[60164]: ERROR nova.compute.manager raise self.value [ 711.996695] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 711.996695] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 711.996695] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 711.996695] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 711.997241] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 711.997241] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 711.997241] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. 
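Annotation: the many "Acquiring lock" / "acquired" / "released" lines (the shared "compute_resources" lock and the per-instance build locks) are emitted by oslo_concurrency.lockutils. A minimal sketch of that serialization pattern follows; update_usage and the tracker dict are illustrative only, not nova's ResourceTracker.

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_usage(tracker, vcpus):
        # Only one greenthread at a time mutates the shared usage totals,
        # matching the instance_claim / abort_instance_claim lines above.
        tracker['vcpus_used'] = tracker.get('vcpus_used', 0) + vcpus

    usage = {}
    update_usage(usage, 1)
    print(usage)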
[ 711.997241] env[60164]: ERROR nova.compute.manager [ 711.997241] env[60164]: Traceback (most recent call last): [ 711.997241] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 711.997241] env[60164]: listener.cb(fileno) [ 711.997241] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 711.997241] env[60164]: result = function(*args, **kwargs) [ 711.997241] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 711.997241] env[60164]: return func(*args, **kwargs) [ 711.997241] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 711.997241] env[60164]: raise e [ 711.997241] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 711.997241] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 711.997241] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 711.997241] env[60164]: created_port_ids = self._update_ports_for_instance( [ 711.997241] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 711.997241] env[60164]: with excutils.save_and_reraise_exception(): [ 711.997241] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 711.997241] env[60164]: self.force_reraise() [ 711.997241] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 711.997241] env[60164]: raise self.value [ 711.997241] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 711.997241] env[60164]: updated_port = self._update_port( [ 711.997241] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 711.997241] env[60164]: _ensure_no_port_binding_failure(port) [ 711.997241] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 711.997241] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 711.998132] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. [ 711.998132] env[60164]: Removing descriptor: 14 [ 711.998132] env[60164]: ERROR nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. 
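Annotation: the eventlet frames (hubs/poll.py, greenthread.py) in the traceback above reflect that _allocate_network_async runs in a spawned greenthread, so the PortBindingFailed only surfaces when the spawning code waits on the result. A small, self-contained sketch of that spawn/wait pattern; allocate_network here is a placeholder, not nova code.

    import eventlet

    def allocate_network(port_id):
        # Placeholder for the asynchronous Neutron allocation; raising here
        # mimics the PortBindingFailed seen in the log.
        raise RuntimeError('Binding failed for port %s' % port_id)

    gt = eventlet.spawn(allocate_network, '9c7bd52d-273d-4d17-8e75-b836df862857')
    try:
        gt.wait()   # the exception re-surfaces here, in the waiting caller
    except RuntimeError as exc:
        print('network allocation failed:', exc)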
[ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Traceback (most recent call last): [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] yield resources [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self.driver.spawn(context, instance, image_meta, [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 711.998132] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] vm_ref = self.build_virtual_machine(instance, [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] vif_infos = vmwarevif.get_vif_info(self._session, [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] for vif in network_info: [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return self._sync_wrapper(fn, *args, **kwargs) [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self.wait() [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self[:] = self._gt.wait() [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return self._exit_event.wait() [ 711.998612] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 711.998612] env[60164]: ERROR nova.compute.manager 
[instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] result = hub.switch() [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return self.greenlet.switch() [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] result = function(*args, **kwargs) [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return func(*args, **kwargs) [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] raise e [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] nwinfo = self.network_api.allocate_for_instance( [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] created_port_ids = self._update_ports_for_instance( [ 711.998970] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] with excutils.save_and_reraise_exception(): [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self.force_reraise() [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] raise self.value [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] updated_port = self._update_port( [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 
9e88b24c-500d-4efb-8563-093dd4d0378d] _ensure_no_port_binding_failure(port) [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] raise exception.PortBindingFailed(port_id=port['id']) [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. [ 712.000851] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] [ 712.002183] env[60164]: INFO nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Terminating instance [ 712.002183] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquiring lock "refresh_cache-9e88b24c-500d-4efb-8563-093dd4d0378d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.002183] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquired lock "refresh_cache-9e88b24c-500d-4efb-8563-093dd4d0378d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.004477] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 712.101308] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 712.252054] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquiring lock "b01c69b3-eec6-4577-8288-d4602da9e251" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.252475] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Lock "b01c69b3-eec6-4577-8288-d4602da9e251" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.383285] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.395221] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Releasing lock "refresh_cache-9bee98ef-48b4-47e6-8afb-e535e58e50cb" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.395221] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 712.395221] env[60164]: DEBUG nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 712.395221] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 712.454698] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 712.470533] env[60164]: DEBUG nova.network.neutron [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.480891] env[60164]: INFO nova.compute.manager [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] [instance: 9bee98ef-48b4-47e6-8afb-e535e58e50cb] Took 0.09 seconds to deallocate network for instance. [ 712.530055] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.541414] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Releasing lock "refresh_cache-9e88b24c-500d-4efb-8563-093dd4d0378d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.541414] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 712.541414] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 712.541764] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-177c0c5f-2160-4ea5-b442-d4de9f30ae51 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.554500] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5fc8d3a-3315-48e9-92d8-71504548eb77 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.588950] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9e88b24c-500d-4efb-8563-093dd4d0378d could not be found. 
[ 712.588950] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 712.588950] env[60164]: INFO nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Took 0.05 seconds to destroy the instance on the hypervisor. [ 712.588950] env[60164]: DEBUG oslo.service.loopingcall [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 712.588950] env[60164]: DEBUG nova.compute.manager [-] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 712.589339] env[60164]: DEBUG nova.network.neutron [-] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 712.599084] env[60164]: INFO nova.scheduler.client.report [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Deleted allocations for instance 9bee98ef-48b4-47e6-8afb-e535e58e50cb [ 712.622923] env[60164]: DEBUG nova.network.neutron [-] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 712.627480] env[60164]: DEBUG oslo_concurrency.lockutils [None req-616d50c4-67f7-421a-9f90-205c09c00959 tempest-ServersTestMultiNic-1861207233 tempest-ServersTestMultiNic-1861207233-project-member] Lock "9bee98ef-48b4-47e6-8afb-e535e58e50cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.876s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.639947] env[60164]: DEBUG nova.network.neutron [-] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.646651] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 712.652740] env[60164]: INFO nova.compute.manager [-] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Took 0.06 seconds to deallocate network for instance. 
[ 712.652740] env[60164]: DEBUG nova.compute.claims [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 712.652740] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.653127] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.735822] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.985522] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24de7835-c467-44cc-a6ad-17be7276f57c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.996181] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0efd61dc-24a5-4a63-a04d-9de9325aa97e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.031381] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2878791-7631-4e3f-87fc-064de7dca6f0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.039567] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3508ddc5-fc1b-4199-8c29-9390e1af2149 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.054530] env[60164]: DEBUG nova.compute.provider_tree [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.063981] env[60164]: DEBUG nova.scheduler.client.report [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.078340] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.425s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.078959] env[60164]: ERROR nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Traceback (most recent call last): [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self.driver.spawn(context, instance, image_meta, [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] vm_ref = self.build_virtual_machine(instance, [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] vif_infos = vmwarevif.get_vif_info(self._session, [ 713.078959] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] for vif in network_info: [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return self._sync_wrapper(fn, *args, **kwargs) [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self.wait() [ 
713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self[:] = self._gt.wait() [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return self._exit_event.wait() [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] result = hub.switch() [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return self.greenlet.switch() [ 713.079350] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] result = function(*args, **kwargs) [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] return func(*args, **kwargs) [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] raise e [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] nwinfo = self.network_api.allocate_for_instance( [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] created_port_ids = self._update_ports_for_instance( [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] with excutils.save_and_reraise_exception(): [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.079768] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] self.force_reraise() [ 713.080112] env[60164]: ERROR 
nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] raise self.value [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] updated_port = self._update_port( [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] _ensure_no_port_binding_failure(port) [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] raise exception.PortBindingFailed(port_id=port['id']) [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] nova.exception.PortBindingFailed: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. [ 713.080112] env[60164]: ERROR nova.compute.manager [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] [ 713.080435] env[60164]: DEBUG nova.compute.utils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 713.081176] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Build of instance 9e88b24c-500d-4efb-8563-093dd4d0378d was re-scheduled: Binding failed for port 9c7bd52d-273d-4d17-8e75-b836df862857, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 713.081642] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 713.081796] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquiring lock "refresh_cache-9e88b24c-500d-4efb-8563-093dd4d0378d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 713.081937] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Acquired lock "refresh_cache-9e88b24c-500d-4efb-8563-093dd4d0378d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.082111] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 713.083934] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.348s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.084985] env[60164]: INFO nova.compute.claims [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.169624] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 713.344670] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32678f4c-6029-4a3e-896a-a086bc511977 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.352770] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d3aed8-90f1-45b2-ba49-6dd7931bff76 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.393251] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7703e9b2-5f48-4ac3-a7f7-183770a53580 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.402572] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b6979e1-e37f-4643-b5ba-adc8879f1a21 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.418018] env[60164]: DEBUG nova.compute.provider_tree [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.428964] env[60164]: DEBUG nova.scheduler.client.report [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.444789] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.445312] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 713.480096] env[60164]: DEBUG nova.compute.utils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.482053] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 713.482371] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 713.493710] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 713.500090] env[60164]: ERROR nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. 
[ 713.500090] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 713.500090] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 713.500090] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 713.500090] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 713.500090] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 713.500090] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 713.500090] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 713.500090] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.500090] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 713.500090] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.500090] env[60164]: ERROR nova.compute.manager raise self.value [ 713.500090] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 713.500090] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 713.500090] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.500090] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 713.500548] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.500548] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 713.500548] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. 
[ 713.500548] env[60164]: ERROR nova.compute.manager [ 713.501736] env[60164]: Traceback (most recent call last): [ 713.501736] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 713.501736] env[60164]: listener.cb(fileno) [ 713.501736] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 713.501736] env[60164]: result = function(*args, **kwargs) [ 713.501736] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 713.501736] env[60164]: return func(*args, **kwargs) [ 713.501736] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 713.501736] env[60164]: raise e [ 713.501736] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 713.501736] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 713.501736] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 713.501736] env[60164]: created_port_ids = self._update_ports_for_instance( [ 713.501736] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 713.501736] env[60164]: with excutils.save_and_reraise_exception(): [ 713.501736] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.501736] env[60164]: self.force_reraise() [ 713.501736] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.501736] env[60164]: raise self.value [ 713.501736] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 713.501736] env[60164]: updated_port = self._update_port( [ 713.501736] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.501736] env[60164]: _ensure_no_port_binding_failure(port) [ 713.501736] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.501736] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 713.501736] env[60164]: nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. [ 713.501736] env[60164]: Removing descriptor: 18 [ 713.504018] env[60164]: ERROR nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. 
[ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Traceback (most recent call last): [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] yield resources [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self.driver.spawn(context, instance, image_meta, [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] vm_ref = self.build_virtual_machine(instance, [ 713.504018] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] vif_infos = vmwarevif.get_vif_info(self._session, [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] for vif in network_info: [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return self._sync_wrapper(fn, *args, **kwargs) [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self.wait() [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self[:] = self._gt.wait() [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return self._exit_event.wait() [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 713.504356] env[60164]: ERROR nova.compute.manager 
[instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] result = hub.switch() [ 713.504356] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return self.greenlet.switch() [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] result = function(*args, **kwargs) [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return func(*args, **kwargs) [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] raise e [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] nwinfo = self.network_api.allocate_for_instance( [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] created_port_ids = self._update_ports_for_instance( [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 713.504738] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] with excutils.save_and_reraise_exception(): [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self.force_reraise() [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] raise self.value [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] updated_port = self._update_port( [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: 
c7cb800a-3634-44e4-bb18-fab9d2e86c7e] _ensure_no_port_binding_failure(port) [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] raise exception.PortBindingFailed(port_id=port['id']) [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. [ 713.505088] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] [ 713.505437] env[60164]: INFO nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Terminating instance [ 713.507999] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-c7cb800a-3634-44e4-bb18-fab9d2e86c7e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 713.507999] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-c7cb800a-3634-44e4-bb18-fab9d2e86c7e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.507999] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 713.546115] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.558802] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Releasing lock "refresh_cache-9e88b24c-500d-4efb-8563-093dd4d0378d" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 713.558802] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 713.558802] env[60164]: DEBUG nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 713.558802] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 713.573677] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 713.597170] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 713.604800] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 713.605043] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 713.605198] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 713.605378] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Flavor pref 0:0:0 {{(pid=60164) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 713.605525] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 713.605912] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 713.605912] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 713.606062] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 713.606254] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 713.607076] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 713.607076] env[60164]: DEBUG nova.virt.hardware [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 713.607582] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c8d54a-5515-4648-a0ad-9843f1fa8e22 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.617198] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2cdd6ac-dea2-4f91-ba23-31c18382e469 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.623015] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 713.637665] env[60164]: DEBUG nova.network.neutron [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.644681] env[60164]: INFO nova.compute.manager [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] [instance: 9e88b24c-500d-4efb-8563-093dd4d0378d] Took 0.09 seconds to deallocate network for instance. [ 713.716305] env[60164]: DEBUG nova.policy [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37788e4056b84ab0b461767fad9e3955', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9af4fbea46444d81b8ed5dd844ce87d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 713.739087] env[60164]: INFO nova.scheduler.client.report [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Deleted allocations for instance 9e88b24c-500d-4efb-8563-093dd4d0378d [ 713.755932] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c2451075-b3e5-46a7-bba0-62ebd8fee8f6 tempest-ServerMetadataNegativeTestJSON-885798172 tempest-ServerMetadataNegativeTestJSON-885798172-project-member] Lock "9e88b24c-500d-4efb-8563-093dd4d0378d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.366s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.780069] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 713.829577] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.829845] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.831388] env[60164]: INFO nova.compute.claims [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 714.111585] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec180067-df88-4b7c-b791-b2e98d9dc207 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.120545] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-863644a2-1f33-4815-bf7a-382d8dc0750e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.157875] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e01d9a78-3b56-4795-b2f3-93464d6bac3a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.166970] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0630531-ca9d-4d1d-b7d1-228813ca4f12 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.187194] env[60164]: DEBUG nova.compute.provider_tree [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.194986] env[60164]: DEBUG nova.scheduler.client.report [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.217158] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b 
tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.217791] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 714.243045] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.266703] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-c7cb800a-3634-44e4-bb18-fab9d2e86c7e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 714.267142] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 714.267450] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 714.267905] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-28622651-d323-4578-b3ae-6c95971b0f9c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.273169] env[60164]: DEBUG nova.compute.utils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 714.276144] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Allocating IP information in the background. 
{{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 714.276248] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 714.287519] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11a2e547-8cfe-43fe-a414-7a6dc9a77373 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.304801] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 714.315098] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c7cb800a-3634-44e4-bb18-fab9d2e86c7e could not be found. [ 714.315098] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 714.315645] env[60164]: INFO nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Took 0.05 seconds to destroy the instance on the hypervisor. [ 714.315869] env[60164]: DEBUG oslo.service.loopingcall [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 714.316312] env[60164]: DEBUG nova.compute.manager [-] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 714.316441] env[60164]: DEBUG nova.network.neutron [-] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 714.351696] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquiring lock "fc85402b-7fcc-4060-b16a-f82d70d6886b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.351932] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Lock "fc85402b-7fcc-4060-b16a-f82d70d6886b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.401539] env[60164]: DEBUG nova.network.neutron [-] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 714.410607] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 714.411371] env[60164]: DEBUG nova.network.neutron [-] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.422348] env[60164]: INFO nova.compute.manager [-] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Took 0.11 seconds to deallocate network for instance. 
[ 714.433732] env[60164]: DEBUG nova.compute.claims [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 714.433732] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.433732] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.447491] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 714.447491] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 714.447491] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 714.447773] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 714.447773] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 714.447773] env[60164]: DEBUG nova.virt.hardware [None 
req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 714.448062] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 714.448234] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 714.448397] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 714.448552] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 714.448717] env[60164]: DEBUG nova.virt.hardware [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 714.449761] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16337333-465d-47da-89dc-2db701f19bb9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.461314] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9417c85-4a97-4618-86bb-332c41a3a555 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.523281] env[60164]: DEBUG nova.policy [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37788e4056b84ab0b461767fad9e3955', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9af4fbea46444d81b8ed5dd844ce87d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.734292] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69c63f68-70e3-4b0f-9829-04781dde9df3 
{{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.741472] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa82c89e-9171-450a-b1d6-337045d44219 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.783548] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f78a179-d2b0-40f7-9ded-ecd215de168e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.792267] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89b57dbb-b730-436e-9bc0-652a0c9e4497 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.810492] env[60164]: DEBUG nova.compute.provider_tree [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.825347] env[60164]: DEBUG nova.scheduler.client.report [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.844375] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.411s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.845204] env[60164]: ERROR nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. 
[ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Traceback (most recent call last): [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self.driver.spawn(context, instance, image_meta, [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] vm_ref = self.build_virtual_machine(instance, [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] vif_infos = vmwarevif.get_vif_info(self._session, [ 714.845204] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] for vif in network_info: [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return self._sync_wrapper(fn, *args, **kwargs) [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self.wait() [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self[:] = self._gt.wait() [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return self._exit_event.wait() [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] result = hub.switch() [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.845533] env[60164]: ERROR 
nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return self.greenlet.switch() [ 714.845533] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] result = function(*args, **kwargs) [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] return func(*args, **kwargs) [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] raise e [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] nwinfo = self.network_api.allocate_for_instance( [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] created_port_ids = self._update_ports_for_instance( [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] with excutils.save_and_reraise_exception(): [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.845894] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] self.force_reraise() [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] raise self.value [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] updated_port = self._update_port( [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] _ensure_no_port_binding_failure(port) [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.846271] env[60164]: ERROR 
nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] raise exception.PortBindingFailed(port_id=port['id']) [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] nova.exception.PortBindingFailed: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. [ 714.846271] env[60164]: ERROR nova.compute.manager [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] [ 714.846271] env[60164]: DEBUG nova.compute.utils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 714.848035] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Build of instance c7cb800a-3634-44e4-bb18-fab9d2e86c7e was re-scheduled: Binding failed for port d075e52f-7ad7-43b6-8bcc-5b50c7a4ec75, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 714.848035] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 714.848242] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-c7cb800a-3634-44e4-bb18-fab9d2e86c7e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.848378] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-c7cb800a-3634-44e4-bb18-fab9d2e86c7e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.848527] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 714.978819] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "e75afc9c-035c-4926-b72a-d570b5f2e6f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.978819] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 
tempest-ServersAdminTestJSON-1751613823-project-member] Lock "e75afc9c-035c-4926-b72a-d570b5f2e6f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.014933] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 715.789020] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Successfully created port: 21decdcc-d46e-4851-8eee-8a89912b5691 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 715.803632] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.818300] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-c7cb800a-3634-44e4-bb18-fab9d2e86c7e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.818902] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 715.819244] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 715.819603] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 715.889031] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 715.892694] env[60164]: ERROR nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. [ 715.892694] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 715.892694] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 715.892694] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 715.892694] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 715.892694] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 715.892694] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 715.892694] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 715.892694] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.892694] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 715.892694] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.892694] env[60164]: ERROR nova.compute.manager raise self.value [ 715.892694] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 715.892694] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 715.892694] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.892694] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 715.893258] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.893258] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 715.893258] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. 
[ 715.893258] env[60164]: ERROR nova.compute.manager [ 715.893258] env[60164]: Traceback (most recent call last): [ 715.893258] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 715.893258] env[60164]: listener.cb(fileno) [ 715.893258] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.893258] env[60164]: result = function(*args, **kwargs) [ 715.893258] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.893258] env[60164]: return func(*args, **kwargs) [ 715.893258] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 715.893258] env[60164]: raise e [ 715.893258] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 715.893258] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 715.893258] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 715.893258] env[60164]: created_port_ids = self._update_ports_for_instance( [ 715.893258] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 715.893258] env[60164]: with excutils.save_and_reraise_exception(): [ 715.893258] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.893258] env[60164]: self.force_reraise() [ 715.893258] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.893258] env[60164]: raise self.value [ 715.893258] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 715.893258] env[60164]: updated_port = self._update_port( [ 715.893258] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.893258] env[60164]: _ensure_no_port_binding_failure(port) [ 715.893258] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.893258] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 715.894412] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. [ 715.894412] env[60164]: Removing descriptor: 12 [ 715.895791] env[60164]: ERROR nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. 
[ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Traceback (most recent call last): [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] yield resources [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self.driver.spawn(context, instance, image_meta, [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] vm_ref = self.build_virtual_machine(instance, [ 715.895791] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] vif_infos = vmwarevif.get_vif_info(self._session, [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] for vif in network_info: [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return self._sync_wrapper(fn, *args, **kwargs) [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self.wait() [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self[:] = self._gt.wait() [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return self._exit_event.wait() [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 715.896363] env[60164]: ERROR nova.compute.manager 
[instance: 8fcf260d-2796-4972-b217-95954e309a6e] result = hub.switch() [ 715.896363] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return self.greenlet.switch() [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] result = function(*args, **kwargs) [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return func(*args, **kwargs) [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] raise e [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] nwinfo = self.network_api.allocate_for_instance( [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] created_port_ids = self._update_ports_for_instance( [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 715.896930] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] with excutils.save_and_reraise_exception(): [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self.force_reraise() [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] raise self.value [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] updated_port = self._update_port( [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 
8fcf260d-2796-4972-b217-95954e309a6e] _ensure_no_port_binding_failure(port) [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] raise exception.PortBindingFailed(port_id=port['id']) [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. [ 715.897482] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] [ 715.897835] env[60164]: INFO nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Terminating instance [ 715.899472] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.899472] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.899472] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 715.908124] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.918393] env[60164]: INFO nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: c7cb800a-3634-44e4-bb18-fab9d2e86c7e] Took 0.10 seconds to deallocate network for instance. [ 715.973056] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 716.035375] env[60164]: INFO nova.scheduler.client.report [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Deleted allocations for instance c7cb800a-3634-44e4-bb18-fab9d2e86c7e [ 716.054496] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "c7cb800a-3634-44e4-bb18-fab9d2e86c7e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.720s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.074843] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 716.127451] env[60164]: DEBUG nova.compute.manager [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Received event network-changed-0e04edf6-7d03-4368-98ac-203be2fde2ed {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 716.127705] env[60164]: DEBUG nova.compute.manager [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Refreshing instance network info cache due to event network-changed-0e04edf6-7d03-4368-98ac-203be2fde2ed. 
{{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 716.127893] env[60164]: DEBUG oslo_concurrency.lockutils [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] Acquiring lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.136958] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.137453] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.142560] env[60164]: INFO nova.compute.claims [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 716.322832] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Successfully created port: db9c5b4c-7f66-4453-85b1-d47606e0a329 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 716.407351] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67361302-4609-43d1-8909-62fa77d36060 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.415233] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbf77e02-2eb3-40a7-9812-becfcfbfb1cd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.448805] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6fa5ae-df47-4359-83d6-a148c41ff88b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.456723] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad63e3c8-cd08-4696-97b8-facbee238898 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.472621] env[60164]: DEBUG nova.compute.provider_tree [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.482571] env[60164]: DEBUG nova.scheduler.client.report [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 
tempest-ServerDiskConfigTestJSON-249548930-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.501668] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.364s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.502266] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 716.539719] env[60164]: DEBUG nova.compute.utils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 716.541297] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 716.541868] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 716.551573] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 716.615618] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 716.638333] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 716.638789] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 716.639084] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 716.639845] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 716.639845] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 716.639845] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 716.640143] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 716.640400] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 716.640666] env[60164]: DEBUG 
nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 716.640947] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 716.643017] env[60164]: DEBUG nova.virt.hardware [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 716.643017] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4b74c30-2216-4473-b4e5-0a81c8bb9004 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.650892] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfa7f14d-cd69-4a91-8937-ef4c44ce9403 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.741997] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.753043] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.753043] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 716.753043] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 716.753043] env[60164]: DEBUG oslo_concurrency.lockutils [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] Acquired lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.753043] env[60164]: DEBUG nova.network.neutron [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Refreshing network info cache for port 0e04edf6-7d03-4368-98ac-203be2fde2ed {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 716.753329] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-164715d2-767f-4bfd-9a06-9e16c75bc1b4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.766515] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9154c420-989b-4261-85d0-4fbfee62ac70 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.791760] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8fcf260d-2796-4972-b217-95954e309a6e could not be found. [ 716.794968] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 716.794968] env[60164]: INFO nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 716.794968] env[60164]: DEBUG oslo.service.loopingcall [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 716.794968] env[60164]: DEBUG nova.compute.manager [-] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 716.794968] env[60164]: DEBUG nova.network.neutron [-] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 716.863649] env[60164]: DEBUG nova.network.neutron [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 716.953505] env[60164]: DEBUG nova.network.neutron [-] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 716.964042] env[60164]: DEBUG nova.network.neutron [-] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.973246] env[60164]: INFO nova.compute.manager [-] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Took 0.18 seconds to deallocate network for instance. [ 716.976393] env[60164]: DEBUG nova.compute.claims [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 716.976786] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.977019] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.049289] env[60164]: DEBUG nova.policy [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f7221f06a45d45f2a34ab3bdd869113d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d75debce2fd4b2492cc02aeb2fed7fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 717.206893] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8f38c662-2b57-48f1-bb1f-8f503f9f6234 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.218336] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a592a6d8-40aa-4d82-b740-8e92064a053e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.254545] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7c7dc6-c8b5-4e38-8635-30780088b4a5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.263809] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9981feb0-9634-489a-b0c5-3c92deb4262f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.279500] env[60164]: DEBUG nova.compute.provider_tree [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.289284] env[60164]: DEBUG nova.scheduler.client.report [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.303917] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.304605] env[60164]: ERROR nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. 
[ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Traceback (most recent call last): [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self.driver.spawn(context, instance, image_meta, [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] vm_ref = self.build_virtual_machine(instance, [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.304605] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] for vif in network_info: [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return self._sync_wrapper(fn, *args, **kwargs) [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self.wait() [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self[:] = self._gt.wait() [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return self._exit_event.wait() [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] result = hub.switch() [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.304995] env[60164]: ERROR 
nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return self.greenlet.switch() [ 717.304995] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] result = function(*args, **kwargs) [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] return func(*args, **kwargs) [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] raise e [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] nwinfo = self.network_api.allocate_for_instance( [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] created_port_ids = self._update_ports_for_instance( [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] with excutils.save_and_reraise_exception(): [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.305380] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] self.force_reraise() [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] raise self.value [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] updated_port = self._update_port( [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] _ensure_no_port_binding_failure(port) [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.305749] env[60164]: ERROR 
nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] raise exception.PortBindingFailed(port_id=port['id']) [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] nova.exception.PortBindingFailed: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. [ 717.305749] env[60164]: ERROR nova.compute.manager [instance: 8fcf260d-2796-4972-b217-95954e309a6e] [ 717.305749] env[60164]: DEBUG nova.compute.utils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 717.311024] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Build of instance 8fcf260d-2796-4972-b217-95954e309a6e was re-scheduled: Binding failed for port 0e04edf6-7d03-4368-98ac-203be2fde2ed, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 717.311024] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 717.311024] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 717.512519] env[60164]: DEBUG nova.network.neutron [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.513864] env[60164]: ERROR nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. 
[ 717.513864] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 717.513864] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 717.513864] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 717.513864] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 717.513864] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 717.513864] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 717.513864] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 717.513864] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.513864] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 717.513864] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.513864] env[60164]: ERROR nova.compute.manager raise self.value [ 717.513864] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 717.513864] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 717.513864] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.513864] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 717.514372] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.514372] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 717.514372] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. 
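The traceback above bottoms out in nova/network/neutron.py:_ensure_no_port_binding_failure raising PortBindingFailed for port 6e276e45-fe27-414d-ba81-a3de27e5773a. As a rough illustration of the kind of check those frames point at, here is a minimal, self-contained Python sketch; it assumes only the standard Neutron 'binding:vif_type' and 'id' keys on the port dict, and defines a local stand-in exception rather than importing Nova's own class.

    # Sketch of a port-binding sanity check in the spirit of the
    # _ensure_no_port_binding_failure frames above. Everything except the
    # Neutron 'binding:vif_type' / 'id' keys is an illustrative assumption.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")
            self.port_id = port_id

    def ensure_no_port_binding_failure(port):
        # Neutron marks a port whose binding could not be completed with the
        # special vif_type 'binding_failed'; surfacing that as an exception
        # lets the caller abort the resource claim and re-schedule the build.
        if port.get("binding:vif_type") == "binding_failed":
            raise PortBindingFailed(port["id"])

    # Example with a port shaped like the one failing in this log:
    try:
        ensure_no_port_binding_failure(
            {"id": "6e276e45-fe27-414d-ba81-a3de27e5773a",
             "binding:vif_type": "binding_failed"})
    except PortBindingFailed as exc:
        print(exc)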
[ 717.514372] env[60164]: ERROR nova.compute.manager [ 717.514372] env[60164]: Traceback (most recent call last): [ 717.514372] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 717.514372] env[60164]: listener.cb(fileno) [ 717.514372] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 717.514372] env[60164]: result = function(*args, **kwargs) [ 717.514372] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 717.514372] env[60164]: return func(*args, **kwargs) [ 717.514372] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 717.514372] env[60164]: raise e [ 717.514372] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 717.514372] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 717.514372] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 717.514372] env[60164]: created_port_ids = self._update_ports_for_instance( [ 717.514372] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 717.514372] env[60164]: with excutils.save_and_reraise_exception(): [ 717.514372] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.514372] env[60164]: self.force_reraise() [ 717.514372] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.514372] env[60164]: raise self.value [ 717.514372] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 717.514372] env[60164]: updated_port = self._update_port( [ 717.514372] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.514372] env[60164]: _ensure_no_port_binding_failure(port) [ 717.514372] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.514372] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 717.515234] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. [ 717.515234] env[60164]: Removing descriptor: 17 [ 717.515234] env[60164]: ERROR nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. 
[ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Traceback (most recent call last): [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] yield resources [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self.driver.spawn(context, instance, image_meta, [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.515234] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] vm_ref = self.build_virtual_machine(instance, [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] for vif in network_info: [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return self._sync_wrapper(fn, *args, **kwargs) [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self.wait() [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self[:] = self._gt.wait() [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return self._exit_event.wait() [ 717.515583] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 717.515583] env[60164]: ERROR nova.compute.manager 
[instance: c9c2d371-978e-4037-ba78-9b44f40765bd] result = hub.switch() [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return self.greenlet.switch() [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] result = function(*args, **kwargs) [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return func(*args, **kwargs) [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] raise e [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] nwinfo = self.network_api.allocate_for_instance( [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] created_port_ids = self._update_ports_for_instance( [ 717.515931] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] with excutils.save_and_reraise_exception(): [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self.force_reraise() [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] raise self.value [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] updated_port = self._update_port( [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: 
c9c2d371-978e-4037-ba78-9b44f40765bd] _ensure_no_port_binding_failure(port) [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] raise exception.PortBindingFailed(port_id=port['id']) [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. [ 717.516285] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] [ 717.516597] env[60164]: INFO nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Terminating instance [ 717.518235] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 717.518404] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquired lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 717.519022] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 717.524349] env[60164]: DEBUG oslo_concurrency.lockutils [req-11ba8a4a-38f9-42c8-ad49-146dd92dcae2 req-902318fb-3436-4240-8f00-e01223163a60 service nova] Releasing lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.524737] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 717.524914] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 717.603492] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 717.875630] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "47d86b97-4bf1-40d4-b666-a530901d28dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.875914] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "47d86b97-4bf1-40d4-b666-a530901d28dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.886991] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 718.106671] env[60164]: DEBUG nova.compute.manager [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Received event network-changed-6e276e45-fe27-414d-ba81-a3de27e5773a {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 718.106904] env[60164]: DEBUG nova.compute.manager [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Refreshing instance network info cache due to event network-changed-6e276e45-fe27-414d-ba81-a3de27e5773a. {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 718.109616] env[60164]: DEBUG oslo_concurrency.lockutils [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] Acquiring lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.424318] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.434546] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Releasing lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 718.435056] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 718.435327] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 718.435939] env[60164]: DEBUG oslo_concurrency.lockutils [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] Acquired lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.436196] env[60164]: DEBUG nova.network.neutron [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Refreshing network info cache for port 6e276e45-fe27-414d-ba81-a3de27e5773a {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 718.438579] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a3c2dfb6-baf6-4c58-aedc-bef97c399376 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.458546] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-956134d8-0f68-4742-a448-752389001d49 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.486563] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c9c2d371-978e-4037-ba78-9b44f40765bd could not be found. [ 718.487622] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 718.487622] env[60164]: INFO nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Took 0.05 seconds to destroy the instance on the hypervisor. [ 718.487906] env[60164]: DEBUG oslo.service.loopingcall [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 718.488470] env[60164]: DEBUG nova.compute.manager [-] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 718.488654] env[60164]: DEBUG nova.network.neutron [-] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 718.530560] env[60164]: DEBUG nova.network.neutron [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 718.592905] env[60164]: DEBUG nova.network.neutron [-] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 718.600369] env[60164]: DEBUG nova.network.neutron [-] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.609433] env[60164]: INFO nova.compute.manager [-] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Took 0.12 seconds to deallocate network for instance. [ 718.611629] env[60164]: DEBUG nova.compute.claims [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 718.611769] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.611960] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.621998] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.629655] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-8fcf260d-2796-4972-b217-95954e309a6e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 718.629655] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d 
tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 718.629655] env[60164]: DEBUG nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 718.629655] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 718.714911] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 718.722708] env[60164]: DEBUG nova.network.neutron [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.733931] env[60164]: INFO nova.compute.manager [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 8fcf260d-2796-4972-b217-95954e309a6e] Took 0.11 seconds to deallocate network for instance. [ 718.840394] env[60164]: INFO nova.scheduler.client.report [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Deleted allocations for instance 8fcf260d-2796-4972-b217-95954e309a6e [ 718.858665] env[60164]: DEBUG oslo_concurrency.lockutils [None req-0e92c309-cbac-45de-8e98-2aaf7268129d tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "8fcf260d-2796-4972-b217-95954e309a6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.487s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.873082] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32389dd5-55dc-4608-8031-69dac136d414 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.878220] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 718.883594] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b69b270-00f8-4aed-9b6d-530c9018b6cf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.917686] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bc5999b-464b-4066-8b17-dc02157dba7b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.929452] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caa04ed7-f0c7-4808-8309-76ff15da9497 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.943811] env[60164]: DEBUG nova.compute.provider_tree [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.948023] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.954551] env[60164]: DEBUG nova.scheduler.client.report [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.968513] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.356s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.968687] env[60164]: ERROR nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. 
[ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Traceback (most recent call last): [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self.driver.spawn(context, instance, image_meta, [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] vm_ref = self.build_virtual_machine(instance, [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] vif_infos = vmwarevif.get_vif_info(self._session, [ 718.968687] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] for vif in network_info: [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return self._sync_wrapper(fn, *args, **kwargs) [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self.wait() [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self[:] = self._gt.wait() [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return self._exit_event.wait() [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] result = hub.switch() [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.969202] env[60164]: ERROR 
nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return self.greenlet.switch() [ 718.969202] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] result = function(*args, **kwargs) [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] return func(*args, **kwargs) [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] raise e [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] nwinfo = self.network_api.allocate_for_instance( [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] created_port_ids = self._update_ports_for_instance( [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] with excutils.save_and_reraise_exception(): [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.969808] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] self.force_reraise() [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] raise self.value [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] updated_port = self._update_port( [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] _ensure_no_port_binding_failure(port) [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.970403] env[60164]: ERROR 
nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] raise exception.PortBindingFailed(port_id=port['id']) [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] nova.exception.PortBindingFailed: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. [ 718.970403] env[60164]: ERROR nova.compute.manager [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] [ 718.970403] env[60164]: DEBUG nova.compute.utils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 718.970866] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.025s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.972209] env[60164]: INFO nova.compute.claims [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 718.975081] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Build of instance c9c2d371-978e-4037-ba78-9b44f40765bd was re-scheduled: Binding failed for port 6e276e45-fe27-414d-ba81-a3de27e5773a, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 718.978119] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 718.978119] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.162895] env[60164]: ERROR nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. 
[ 719.162895] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 719.162895] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 719.162895] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 719.162895] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 719.162895] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 719.162895] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 719.162895] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 719.162895] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.162895] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 719.162895] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.162895] env[60164]: ERROR nova.compute.manager raise self.value [ 719.162895] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 719.162895] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 719.162895] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.162895] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 719.163784] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.163784] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 719.163784] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. 
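Every PortBindingFailed traceback in this log bottoms out in the same frame, nova/network/neutron.py:294 (_ensure_no_port_binding_failure); the spawn-side traceback that continues below ends there as well. As a reading aid, here is a minimal sketch of that check reconstructed from the traceback: the raise statement matches the logged frames verbatim, while the trigger condition (the port's binding:vif_type being reported as binding_failed by Neutron) is an assumption about typical Nova behaviour rather than something visible in this log.

# Sketch, reconstructed from the traceback frames above; details may differ
# between Nova releases.
from nova import exception
from nova.network import model as network_model


def _ensure_no_port_binding_failure(port):
    # Neutron marks a port whose binding could not be completed by setting
    # binding:vif_type to 'binding_failed' (assumed trigger); Nova converts
    # that into PortBindingFailed, which aborts _build_and_run_instance and
    # leads to the "was re-scheduled" messages seen later in this log.
    binding_vif_type = port.get('binding:vif_type')
    if binding_vif_type == network_model.VIF_TYPE_BINDING_FAILED:
        raise exception.PortBindingFailed(port_id=port['id'])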
[ 719.163784] env[60164]: ERROR nova.compute.manager [ 719.163784] env[60164]: Traceback (most recent call last): [ 719.163784] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 719.163784] env[60164]: listener.cb(fileno) [ 719.163784] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 719.163784] env[60164]: result = function(*args, **kwargs) [ 719.163784] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 719.163784] env[60164]: return func(*args, **kwargs) [ 719.163784] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 719.163784] env[60164]: raise e [ 719.163784] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 719.163784] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 719.163784] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 719.163784] env[60164]: created_port_ids = self._update_ports_for_instance( [ 719.163784] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 719.163784] env[60164]: with excutils.save_and_reraise_exception(): [ 719.163784] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.163784] env[60164]: self.force_reraise() [ 719.163784] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.163784] env[60164]: raise self.value [ 719.163784] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 719.163784] env[60164]: updated_port = self._update_port( [ 719.163784] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.163784] env[60164]: _ensure_no_port_binding_failure(port) [ 719.163784] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.163784] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 719.164489] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. [ 719.164489] env[60164]: Removing descriptor: 19 [ 719.164489] env[60164]: ERROR nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. 
[ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Traceback (most recent call last): [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] yield resources [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self.driver.spawn(context, instance, image_meta, [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 719.164489] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] vm_ref = self.build_virtual_machine(instance, [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] vif_infos = vmwarevif.get_vif_info(self._session, [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] for vif in network_info: [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return self._sync_wrapper(fn, *args, **kwargs) [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self.wait() [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self[:] = self._gt.wait() [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return self._exit_event.wait() [ 719.164831] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 719.164831] env[60164]: ERROR nova.compute.manager 
[instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] result = hub.switch() [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return self.greenlet.switch() [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] result = function(*args, **kwargs) [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return func(*args, **kwargs) [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] raise e [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] nwinfo = self.network_api.allocate_for_instance( [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] created_port_ids = self._update_ports_for_instance( [ 719.165179] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] with excutils.save_and_reraise_exception(): [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self.force_reraise() [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] raise self.value [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] updated_port = self._update_port( [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 
156cf534-81ca-4cc6-9b0d-2d245016c53c] _ensure_no_port_binding_failure(port) [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] raise exception.PortBindingFailed(port_id=port['id']) [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. [ 719.165549] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] [ 719.165970] env[60164]: INFO nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Terminating instance [ 719.169866] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.169866] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquired lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.169866] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 719.249070] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 719.277172] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-023ca255-f3ee-4f89-90f5-eaad99cca5b5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.285605] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5faa37a5-6f76-4f90-8f1a-ebcebd4ea11a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.323210] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48f0096c-7509-4fcd-b436-d78e39357871 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.334169] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f8a57c-9567-4152-b7e3-497a22ee4e6d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.351976] env[60164]: DEBUG nova.compute.provider_tree [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 719.361893] env[60164]: DEBUG nova.scheduler.client.report [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 719.381680] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.411s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 719.382175] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 719.388949] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Successfully created port: 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 719.399370] env[60164]: DEBUG nova.network.neutron [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.421062] env[60164]: DEBUG oslo_concurrency.lockutils [req-eacd423b-b30c-4e24-84b9-d67da5ac13b6 req-802f7acb-3325-4f3c-a42b-8a4173ab1eb1 service nova] Releasing lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.421062] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquired lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.421062] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 719.443137] env[60164]: DEBUG nova.compute.utils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 719.444517] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 719.444595] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 719.458606] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 719.530799] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 719.548844] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 719.572124] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 719.572124] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 719.572124] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 719.572340] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 719.572340] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 719.572340] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 719.572439] env[60164]: DEBUG nova.virt.hardware [None 
req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 719.572540] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 719.572700] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 719.572854] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 719.573026] env[60164]: DEBUG nova.virt.hardware [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 719.575525] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb2e4dc-2e52-4a58-b3d2-0279d0c82f34 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.584384] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a18946d-fa7b-4a5d-98be-0f659b268003 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.700314] env[60164]: DEBUG nova.compute.manager [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Received event network-changed-22a86f22-b09e-42d4-94fe-94f6c03a4a0b {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 719.700314] env[60164]: DEBUG nova.compute.manager [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Refreshing instance network info cache due to event network-changed-22a86f22-b09e-42d4-94fe-94f6c03a4a0b. 
{{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 719.700314] env[60164]: DEBUG oslo_concurrency.lockutils [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] Acquiring lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.734985] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.741738] env[60164]: DEBUG nova.policy [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc8d455debe94abf852a3465d733d828', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7f572f853ca47cd9a40604f8a7f6c36', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 719.745065] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Releasing lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.745479] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 719.745942] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 719.746047] env[60164]: DEBUG oslo_concurrency.lockutils [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] Acquired lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.746759] env[60164]: DEBUG nova.network.neutron [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Refreshing network info cache for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 719.747788] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f8775199-6d15-4409-8ffa-ae98ac35c9ae {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.761778] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49bc3951-ebe1-49b9-9958-7c5536bca5a8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.791361] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 156cf534-81ca-4cc6-9b0d-2d245016c53c could not be found. [ 719.791634] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 719.791820] env[60164]: INFO nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 719.792090] env[60164]: DEBUG oslo.service.loopingcall [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 719.792326] env[60164]: DEBUG nova.compute.manager [-] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 719.792420] env[60164]: DEBUG nova.network.neutron [-] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 719.852437] env[60164]: DEBUG nova.network.neutron [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 719.863151] env[60164]: DEBUG nova.network.neutron [-] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 719.870704] env[60164]: DEBUG nova.network.neutron [-] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.884791] env[60164]: INFO nova.compute.manager [-] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Took 0.09 seconds to deallocate network for instance. [ 719.884791] env[60164]: DEBUG nova.compute.claims [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 719.885218] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.885218] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.136137] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea9487bc-38e2-453d-823d-adb56112b1c0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.144280] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3862e9-3cdd-48a9-be55-30eaac3051fc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.177307] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0918182f-0437-4b98-907e-6352ddc05059 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.185188] env[60164]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e120474-c715-4639-a475-1d35a411018c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.198517] env[60164]: DEBUG nova.compute.provider_tree [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.208265] env[60164]: DEBUG nova.scheduler.client.report [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.224549] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.339s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.225236] env[60164]: ERROR nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. 
[ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Traceback (most recent call last): [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self.driver.spawn(context, instance, image_meta, [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] vm_ref = self.build_virtual_machine(instance, [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] vif_infos = vmwarevif.get_vif_info(self._session, [ 720.225236] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] for vif in network_info: [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return self._sync_wrapper(fn, *args, **kwargs) [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self.wait() [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self[:] = self._gt.wait() [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return self._exit_event.wait() [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] result = hub.switch() [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 720.225617] env[60164]: ERROR 
nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return self.greenlet.switch() [ 720.225617] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] result = function(*args, **kwargs) [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] return func(*args, **kwargs) [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] raise e [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] nwinfo = self.network_api.allocate_for_instance( [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] created_port_ids = self._update_ports_for_instance( [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] with excutils.save_and_reraise_exception(): [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 720.226022] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] self.force_reraise() [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] raise self.value [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] updated_port = self._update_port( [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] _ensure_no_port_binding_failure(port) [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 720.226401] env[60164]: ERROR 
nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] raise exception.PortBindingFailed(port_id=port['id']) [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] nova.exception.PortBindingFailed: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. [ 720.226401] env[60164]: ERROR nova.compute.manager [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] [ 720.226707] env[60164]: DEBUG nova.compute.utils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 720.227948] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Build of instance 156cf534-81ca-4cc6-9b0d-2d245016c53c was re-scheduled: Binding failed for port 22a86f22-b09e-42d4-94fe-94f6c03a4a0b, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 720.228384] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 720.228693] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquiring lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 720.313163] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.323911] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Releasing lock "refresh_cache-c9c2d371-978e-4037-ba78-9b44f40765bd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 720.324153] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 720.324330] env[60164]: DEBUG nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 720.324485] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 720.394322] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 720.403951] env[60164]: DEBUG nova.network.neutron [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.414992] env[60164]: INFO nova.compute.manager [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c9c2d371-978e-4037-ba78-9b44f40765bd] Took 0.09 seconds to deallocate network for instance. 
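The "Inventory has not changed for provider ... based on inventory data" reports repeat the same figures throughout this run. The schedulable capacity placement derives from such an inventory is, per resource class, (total - reserved) * allocation_ratio; that formula is standard placement behaviour assumed here for illustration, not something stated in this log, while the numbers are copied from the report lines. A short worked sketch:

# Worked example mapping the logged inventory to schedulable capacity.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g}")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400

The max_unit values in the same records (16 VCPUs, 65530 MB, 139 GB) cap what a single instance can claim, independently of these totals.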
[ 720.499971] env[60164]: DEBUG nova.network.neutron [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.510387] env[60164]: DEBUG oslo_concurrency.lockutils [req-35bb01a5-8b92-4f42-bac3-ee4b28b11807 req-784c6e20-e6cb-422a-8f1a-389ca4e37f70 service nova] Releasing lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 720.510802] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Acquired lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 720.511078] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 720.512651] env[60164]: INFO nova.scheduler.client.report [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Deleted allocations for instance c9c2d371-978e-4037-ba78-9b44f40765bd [ 720.543166] env[60164]: DEBUG oslo_concurrency.lockutils [None req-c6361768-afea-4756-bbb0-dabacec97540 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "c9c2d371-978e-4037-ba78-9b44f40765bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.247s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.562019] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 720.604038] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 720.616126] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.616335] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.617908] env[60164]: INFO nova.compute.claims [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 720.870886] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-180c6b77-3edc-4174-b7e4-91bd94fbaeed {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.878574] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cc4af61-fcaf-43ea-a805-009ff19cef31 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.915630] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a6ce071-5b69-46ab-baca-2210cbdc6260 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.924348] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f8ed1ed-94b6-4817-809c-568935ec3389 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.939247] env[60164]: DEBUG nova.compute.provider_tree [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.948065] env[60164]: DEBUG nova.scheduler.client.report [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.961452] env[60164]: DEBUG oslo_concurrency.lockutils [None 
req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.961919] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 720.991213] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.998561] env[60164]: DEBUG nova.compute.utils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 721.000588] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 721.001713] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 721.002725] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Releasing lock "refresh_cache-156cf534-81ca-4cc6-9b0d-2d245016c53c" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.002888] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 721.003073] env[60164]: DEBUG nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 721.003776] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 721.010483] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 721.078807] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 721.101797] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 721.102075] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 721.102664] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 721.102664] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 
tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 721.102664] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 721.102664] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 721.102907] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 721.103200] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 721.103426] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 721.103595] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 721.103760] env[60164]: DEBUG nova.virt.hardware [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 721.104633] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04ec1ac1-ac66-4046-a6b7-6cee3294e89b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.112847] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb233150-9cc7-421c-8ab4-fcfb9266c0ac {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.145283] env[60164]: DEBUG nova.policy [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 
'dd73536207f046218914901213c53c5c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33b22598eb074c35b7782df547b1cdea', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 721.283038] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Successfully created port: 9670ba89-f99d-442f-a928-00c5989967cf {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 721.312978] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 721.331760] env[60164]: DEBUG nova.network.neutron [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.345534] env[60164]: INFO nova.compute.manager [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] [instance: 156cf534-81ca-4cc6-9b0d-2d245016c53c] Took 0.34 seconds to deallocate network for instance. [ 721.459016] env[60164]: INFO nova.scheduler.client.report [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Deleted allocations for instance 156cf534-81ca-4cc6-9b0d-2d245016c53c [ 721.485704] env[60164]: DEBUG oslo_concurrency.lockutils [None req-3856ad9c-4c98-4f9f-9fa6-f556526fd8a6 tempest-ServerRescueNegativeTestJSON-1532022264 tempest-ServerRescueNegativeTestJSON-1532022264-project-member] Lock "156cf534-81ca-4cc6-9b0d-2d245016c53c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.980s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.505098] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 721.564983] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.564983] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.565492] env[60164]: INFO nova.compute.claims [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 721.764701] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70839dfd-1109-430d-b5a2-436874f6e2d8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.774984] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29f9fecc-f6c0-4330-9317-bd9f067f8ac0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.809451] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41bd771e-4553-4a06-acc3-6bb992928880 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.816838] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76371dc8-a7e2-4744-9a7f-c0d31fa37551 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.834815] env[60164]: DEBUG nova.compute.provider_tree [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 721.849834] env[60164]: DEBUG nova.scheduler.client.report [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 721.863915] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 
tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.863915] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 721.897496] env[60164]: DEBUG nova.compute.utils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 721.898917] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 721.899674] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 721.910672] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 721.991401] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 722.013459] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 722.013716] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 722.013873] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 722.014065] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 722.014211] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 722.014354] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 722.014945] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 722.015178] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 722.015441] env[60164]: DEBUG nova.virt.hardware [None 
req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 722.017332] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 722.017332] env[60164]: DEBUG nova.virt.hardware [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 722.017332] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae12454b-2da7-485e-a84c-ef0ac0e0e468 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.025342] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a9a3b12-98d9-4450-a876-70c0c43af172 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.257042] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Successfully created port: 4c38e9c1-0c1e-4465-acd5-be6ebc735624 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 722.412704] env[60164]: ERROR nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. 
[ 722.412704] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 722.412704] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 722.412704] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 722.412704] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 722.412704] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 722.412704] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 722.412704] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 722.412704] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.412704] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 722.412704] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.412704] env[60164]: ERROR nova.compute.manager raise self.value [ 722.412704] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 722.412704] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 722.412704] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.412704] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 722.413293] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.413293] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 722.413293] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. 
[ 722.413293] env[60164]: ERROR nova.compute.manager [ 722.413293] env[60164]: Traceback (most recent call last): [ 722.413293] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 722.413293] env[60164]: listener.cb(fileno) [ 722.413293] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.413293] env[60164]: result = function(*args, **kwargs) [ 722.413293] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.413293] env[60164]: return func(*args, **kwargs) [ 722.413293] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 722.413293] env[60164]: raise e [ 722.413293] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 722.413293] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 722.413293] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 722.413293] env[60164]: created_port_ids = self._update_ports_for_instance( [ 722.413293] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 722.413293] env[60164]: with excutils.save_and_reraise_exception(): [ 722.413293] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.413293] env[60164]: self.force_reraise() [ 722.413293] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.413293] env[60164]: raise self.value [ 722.413293] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 722.413293] env[60164]: updated_port = self._update_port( [ 722.413293] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.413293] env[60164]: _ensure_no_port_binding_failure(port) [ 722.413293] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.413293] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 722.414177] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. [ 722.414177] env[60164]: Removing descriptor: 20 [ 722.414177] env[60164]: ERROR nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. 
[ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Traceback (most recent call last): [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] yield resources [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self.driver.spawn(context, instance, image_meta, [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 722.414177] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] vm_ref = self.build_virtual_machine(instance, [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] vif_infos = vmwarevif.get_vif_info(self._session, [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] for vif in network_info: [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return self._sync_wrapper(fn, *args, **kwargs) [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self.wait() [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self[:] = self._gt.wait() [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return self._exit_event.wait() [ 722.414542] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 722.414542] env[60164]: ERROR nova.compute.manager 
[instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] result = hub.switch() [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return self.greenlet.switch() [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] result = function(*args, **kwargs) [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return func(*args, **kwargs) [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] raise e [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] nwinfo = self.network_api.allocate_for_instance( [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] created_port_ids = self._update_ports_for_instance( [ 722.414981] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] with excutils.save_and_reraise_exception(): [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self.force_reraise() [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] raise self.value [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] updated_port = self._update_port( [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: 
c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] _ensure_no_port_binding_failure(port) [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] raise exception.PortBindingFailed(port_id=port['id']) [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. [ 722.415379] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] [ 722.417207] env[60164]: INFO nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Terminating instance [ 722.417207] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "refresh_cache-c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.417207] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquired lock "refresh_cache-c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.417207] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 722.465296] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 722.502440] env[60164]: DEBUG nova.policy [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e61100842e42452c920d522726703641', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '65241354dfa84a61977e9f11a0483dc9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 722.949409] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.959612] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Releasing lock "refresh_cache-c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.960044] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 722.960244] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 722.960789] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-836bbb48-2d5c-4105-bcb4-94694ce4fecb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.971214] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa9cb69-f903-4076-9b0b-01c3020d8b23 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.003796] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b could not be found. 
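The tracebacks above repeatedly pass through oslo_utils.excutils.save_and_reraise_exception, a context manager that lets cleanup code run inside an except block and then re-raises the original exception on exit; that is why the frames show force_reraise() and "raise self.value". A small usage sketch under assumed helper names (update_port and rollback_ports are hypothetical):

from oslo_utils import excutils

def update_ports_for_instance(ports):
    # update_port() and rollback_ports() are invented for illustration.
    created = []
    try:
        for port in ports:
            created.append(update_port(port))   # may raise PortBindingFailed
    except Exception:
        with excutils.save_and_reraise_exception():
            # Anything here runs before the saved exception is re-raised
            # when the with-block exits, matching the traceback above.
            rollback_ports(created)
    return created
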
[ 723.004165] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 723.004217] env[60164]: INFO nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 723.004463] env[60164]: DEBUG oslo.service.loopingcall [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 723.004703] env[60164]: DEBUG nova.compute.manager [-] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 723.004802] env[60164]: DEBUG nova.network.neutron [-] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 723.076924] env[60164]: DEBUG nova.network.neutron [-] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 723.094199] env[60164]: DEBUG nova.network.neutron [-] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.115131] env[60164]: INFO nova.compute.manager [-] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Took 0.11 seconds to deallocate network for instance. 
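The "compute_resources" lock messages in the surrounding entries show the resource tracker serializing instance_claim and abort_instance_claim on a single named lock. A purely illustrative miniature follows (the class and its counters are invented; only the oslo_concurrency lock decorator mirrors the real mechanism):

from oslo_concurrency import lockutils

class MiniResourceTracker:
    """Invented for illustration; not nova's ResourceTracker."""

    def __init__(self, vcpus, memory_mb):
        self.free_vcpus = vcpus
        self.free_memory_mb = memory_mb

    @lockutils.synchronized('compute_resources')
    def instance_claim(self, vcpus, memory_mb):
        # Mirrors the "Lock 'compute_resources' acquired by ... instance_claim"
        # entries: check and reserve resources while holding the lock.
        if vcpus > self.free_vcpus or memory_mb > self.free_memory_mb:
            raise RuntimeError('insufficient resources')
        self.free_vcpus -= vcpus
        self.free_memory_mb -= memory_mb

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(self, vcpus, memory_mb):
        # Runs when the build fails after a successful claim, as in the
        # "Aborting claim" entry that follows below.
        self.free_vcpus += vcpus
        self.free_memory_mb += memory_mb
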
[ 723.120481] env[60164]: DEBUG nova.compute.claims [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 723.120687] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.120918] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.248052] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "9614d3ee-0911-4b50-9875-93ef3f7f2b5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.248052] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "9614d3ee-0911-4b50-9875-93ef3f7f2b5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.373112] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3395bd8-1ed5-45f3-9b28-611e2dbb8916 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.380611] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-472f1cc8-48c2-4852-83f9-ba673ba1d42d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.424748] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e5ba92-4fb6-4241-bfa0-02a85c37755e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.433496] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae77f834-b7fe-4571-90e5-566fb29d908c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.452346] env[60164]: DEBUG nova.compute.provider_tree [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 
723.471802] env[60164]: DEBUG nova.scheduler.client.report [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.486912] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.366s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.487592] env[60164]: ERROR nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Traceback (most recent call last): [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self.driver.spawn(context, instance, image_meta, [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] vm_ref = self.build_virtual_machine(instance, [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] vif_infos = vmwarevif.get_vif_info(self._session, [ 723.487592] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] for vif in network_info: [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File 
"/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return self._sync_wrapper(fn, *args, **kwargs) [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self.wait() [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self[:] = self._gt.wait() [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return self._exit_event.wait() [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] result = hub.switch() [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return self.greenlet.switch() [ 723.487991] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] result = function(*args, **kwargs) [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] return func(*args, **kwargs) [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] raise e [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] nwinfo = self.network_api.allocate_for_instance( [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] created_port_ids = self._update_ports_for_instance( [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 
723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] with excutils.save_and_reraise_exception(): [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.488387] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] self.force_reraise() [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] raise self.value [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] updated_port = self._update_port( [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] _ensure_no_port_binding_failure(port) [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] raise exception.PortBindingFailed(port_id=port['id']) [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] nova.exception.PortBindingFailed: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. [ 723.488740] env[60164]: ERROR nova.compute.manager [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] [ 723.488740] env[60164]: DEBUG nova.compute.utils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 723.490450] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Build of instance c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b was re-scheduled: Binding failed for port 21decdcc-d46e-4851-8eee-8a89912b5691, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 723.490879] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 723.491134] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "refresh_cache-c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.491283] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquired lock "refresh_cache-c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 723.491570] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 723.557566] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 723.577049] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquiring lock "ab6859e4-807d-4b5f-943b-6491ed211c75" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.577049] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Lock "ab6859e4-807d-4b5f-943b-6491ed211c75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.104399] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.120434] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Releasing lock "refresh_cache-c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 724.120434] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 724.120434] env[60164]: DEBUG nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 724.120434] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 724.176495] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 724.193219] env[60164]: DEBUG nova.network.neutron [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.218618] env[60164]: INFO nova.compute.manager [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b] Took 0.10 seconds to deallocate network for instance. [ 724.329658] env[60164]: INFO nova.scheduler.client.report [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Deleted allocations for instance c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b [ 724.352927] env[60164]: ERROR nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. [ 724.352927] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 724.352927] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 724.352927] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 724.352927] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 724.352927] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 724.352927] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 724.352927] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 724.352927] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 724.352927] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 724.352927] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 724.352927] env[60164]: ERROR nova.compute.manager raise self.value [ 724.352927] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 724.352927] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 724.352927] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 724.352927] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 724.355726] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 724.355726] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 724.355726] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 
3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. [ 724.355726] env[60164]: ERROR nova.compute.manager [ 724.355726] env[60164]: Traceback (most recent call last): [ 724.355726] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 724.355726] env[60164]: listener.cb(fileno) [ 724.355726] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 724.355726] env[60164]: result = function(*args, **kwargs) [ 724.355726] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 724.355726] env[60164]: return func(*args, **kwargs) [ 724.355726] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 724.355726] env[60164]: raise e [ 724.355726] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 724.355726] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 724.355726] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 724.355726] env[60164]: created_port_ids = self._update_ports_for_instance( [ 724.355726] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 724.355726] env[60164]: with excutils.save_and_reraise_exception(): [ 724.355726] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 724.355726] env[60164]: self.force_reraise() [ 724.355726] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 724.355726] env[60164]: raise self.value [ 724.355726] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 724.355726] env[60164]: updated_port = self._update_port( [ 724.355726] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 724.355726] env[60164]: _ensure_no_port_binding_failure(port) [ 724.355726] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 724.355726] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 724.356568] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. [ 724.356568] env[60164]: Removing descriptor: 14 [ 724.356568] env[60164]: ERROR nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. 
[ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Traceback (most recent call last): [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] yield resources [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self.driver.spawn(context, instance, image_meta, [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 724.356568] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] vm_ref = self.build_virtual_machine(instance, [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] vif_infos = vmwarevif.get_vif_info(self._session, [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] for vif in network_info: [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return self._sync_wrapper(fn, *args, **kwargs) [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self.wait() [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self[:] = self._gt.wait() [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return self._exit_event.wait() [ 724.356873] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 724.356873] env[60164]: ERROR nova.compute.manager 
[instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] result = hub.switch() [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return self.greenlet.switch() [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] result = function(*args, **kwargs) [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return func(*args, **kwargs) [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] raise e [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] nwinfo = self.network_api.allocate_for_instance( [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] created_port_ids = self._update_ports_for_instance( [ 724.357242] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] with excutils.save_and_reraise_exception(): [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self.force_reraise() [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] raise self.value [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] updated_port = self._update_port( [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 
43fbb2e2-b827-4fc0-aff4-886a26f4550e] _ensure_no_port_binding_failure(port) [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] raise exception.PortBindingFailed(port_id=port['id']) [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] nova.exception.PortBindingFailed: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. [ 724.357649] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] [ 724.358051] env[60164]: INFO nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Terminating instance [ 724.359783] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "refresh_cache-43fbb2e2-b827-4fc0-aff4-886a26f4550e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 724.359783] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquired lock "refresh_cache-43fbb2e2-b827-4fc0-aff4-886a26f4550e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 724.359783] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 724.361566] env[60164]: DEBUG oslo_concurrency.lockutils [None req-7e828195-3163-4e64-9ad9-2ccd9055d3d5 tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "c3e9f1b6-5feb-4d0a-ac70-67918b66fb0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.162s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.379944] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 724.439076] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.439330] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.441213] env[60164]: INFO nova.compute.claims [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 724.446414] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 724.684933] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf36c56-4cf1-4961-986f-e5af82dd46b7 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.695334] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6fd533f-941e-450b-b1fb-b0223d70a58c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.733212] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e7db87d-7d4b-4f83-8d83-4aa9f5289fd3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.741113] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aa7b177-d889-4bd6-956c-65bd8dbeff0e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.755082] env[60164]: DEBUG nova.compute.provider_tree [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.764576] env[60164]: DEBUG nova.scheduler.client.report [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 724.779511] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.780035] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 724.814568] env[60164]: DEBUG nova.compute.utils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 724.816345] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.817705] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 724.817705] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 724.823739] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Releasing lock "refresh_cache-43fbb2e2-b827-4fc0-aff4-886a26f4550e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 724.824134] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 724.824323] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 724.824956] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-26818ae6-b077-41f6-9551-51f568086411 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.828521] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 724.837690] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dfb04db-763a-44cd-82af-bb6507a05d8a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.873803] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 43fbb2e2-b827-4fc0-aff4-886a26f4550e could not be found. [ 724.874036] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 724.874213] env[60164]: INFO nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Took 0.05 seconds to destroy the instance on the hypervisor. [ 724.874441] env[60164]: DEBUG oslo.service.loopingcall [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 724.874864] env[60164]: DEBUG nova.compute.manager [-] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 724.874952] env[60164]: DEBUG nova.network.neutron [-] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 724.903151] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 724.933058] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 724.933058] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 724.933058] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 724.933220] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 724.933251] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 724.933383] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 724.933566] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 724.933712] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 724.933872] env[60164]: DEBUG nova.virt.hardware [None 
req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 724.934596] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 724.934803] env[60164]: DEBUG nova.virt.hardware [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 724.935710] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbdd6027-3c78-43c5-9e41-ae494967dc37 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.946449] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f359f082-f875-4397-b143-80d1663b0e67 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.960426] env[60164]: DEBUG nova.network.neutron [-] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 724.968831] env[60164]: DEBUG nova.network.neutron [-] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.978757] env[60164]: INFO nova.compute.manager [-] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Took 0.10 seconds to deallocate network for instance. 
[ 724.981237] env[60164]: DEBUG nova.compute.claims [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 724.981350] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.981560] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.021545] env[60164]: DEBUG nova.policy [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e61100842e42452c920d522726703641', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '65241354dfa84a61977e9f11a0483dc9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 725.129277] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Successfully created port: ca937adf-d12a-4397-bdb1-e9c32bd7d7a4 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 725.222067] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8fab1fe-8490-4431-91b9-4c1aea354f05 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.230306] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72e0f317-70ce-45cd-bfc7-015b570441de {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.265826] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7fcf113-82f6-42a9-a1ec-2ef2431fe8bf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.275408] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f794ea68-50b4-44c1-898b-a8f9017fa5ee {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.290382] env[60164]: DEBUG nova.compute.provider_tree [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] 
Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 725.301268] env[60164]: DEBUG nova.scheduler.client.report [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 725.330840] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.347s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.330840] env[60164]: ERROR nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. 
[ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Traceback (most recent call last): [ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self.driver.spawn(context, instance, image_meta, [ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 725.330840] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] vm_ref = self.build_virtual_machine(instance, [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] vif_infos = vmwarevif.get_vif_info(self._session, [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] for vif in network_info: [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return self._sync_wrapper(fn, *args, **kwargs) [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self.wait() [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self[:] = self._gt.wait() [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return self._exit_event.wait() [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 725.331206] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] result = hub.switch() [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 725.331555] env[60164]: ERROR 
nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return self.greenlet.switch() [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] result = function(*args, **kwargs) [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] return func(*args, **kwargs) [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] raise e [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] nwinfo = self.network_api.allocate_for_instance( [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] created_port_ids = self._update_ports_for_instance( [ 725.331555] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] with excutils.save_and_reraise_exception(): [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] self.force_reraise() [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] raise self.value [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] updated_port = self._update_port( [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] _ensure_no_port_binding_failure(port) [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 725.331892] env[60164]: ERROR 
nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] raise exception.PortBindingFailed(port_id=port['id']) [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] nova.exception.PortBindingFailed: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. [ 725.331892] env[60164]: ERROR nova.compute.manager [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] [ 725.332237] env[60164]: DEBUG nova.compute.utils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 725.332237] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Build of instance 43fbb2e2-b827-4fc0-aff4-886a26f4550e was re-scheduled: Binding failed for port 3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 725.332337] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 725.332903] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquiring lock "refresh_cache-43fbb2e2-b827-4fc0-aff4-886a26f4550e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 725.333778] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Acquired lock "refresh_cache-43fbb2e2-b827-4fc0-aff4-886a26f4550e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 725.333778] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 725.430922] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 725.812583] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 725.823647] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Releasing lock "refresh_cache-43fbb2e2-b827-4fc0-aff4-886a26f4550e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 725.823896] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 725.824097] env[60164]: DEBUG nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 725.824257] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 725.870855] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 725.881425] env[60164]: DEBUG nova.network.neutron [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 725.891212] env[60164]: INFO nova.compute.manager [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] [instance: 43fbb2e2-b827-4fc0-aff4-886a26f4550e] Took 0.07 seconds to deallocate network for instance. 
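
The PortBindingFailed tracebacks above all end in _ensure_no_port_binding_failure (nova/network/neutron.py, line 294 in this tree). As a reading aid, the minimal Python sketch below shows what that check amounts to, inferred from the traceback and stock Nova/Neutron behaviour: a port that Neutron's mechanism drivers could not bind comes back with binding:vif_type set to 'binding_failed', and Nova turns that into the PortBindingFailed exception that aborts the spawn. This is a simplified illustration, not the Nova source; the exception class and helper here are stand-ins.

# Simplified stand-in for the check behind the PortBindingFailed errors above.
# Inferred from the traceback (_ensure_no_port_binding_failure in
# nova/network/neutron.py); not the actual Nova source.

VIF_TYPE_BINDING_FAILED = "binding_failed"


class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information."
        )
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    """Raise if Neutron reported a failed binding for this port."""
    if port.get("binding:vif_type") == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port["id"])


if __name__ == "__main__":
    # A port that Neutron failed to bind, as in the log above.
    failed_port = {
        "id": "3b7f356e-c9d5-4d4a-b070-7f55ba4a3e59",
        "binding:vif_type": VIF_TYPE_BINDING_FAILED,
    }
    try:
        ensure_no_port_binding_failure(failed_port)
    except PortBindingFailed as exc:
        print(f"nova.exception.PortBindingFailed: {exc}")

When this exception escapes _build_and_run_instance, the compute manager treats the failure as retriable: as logged above, the build is re-scheduled, VIFs are unplugged where the driver supports it, and the network allocation is torn down.
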
[ 726.005104] env[60164]: INFO nova.scheduler.client.report [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Deleted allocations for instance 43fbb2e2-b827-4fc0-aff4-886a26f4550e [ 726.032727] env[60164]: DEBUG oslo_concurrency.lockutils [None req-089040d8-fa96-4a76-be1b-d929bfa36203 tempest-ServerDiskConfigTestJSON-249548930 tempest-ServerDiskConfigTestJSON-249548930-project-member] Lock "43fbb2e2-b827-4fc0-aff4-886a26f4550e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.549s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.050891] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 726.109029] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.111018] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.111547] env[60164]: INFO nova.compute.claims [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 726.140101] env[60164]: ERROR nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. 
[ 726.140101] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 726.140101] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 726.140101] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 726.140101] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 726.140101] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 726.140101] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 726.140101] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 726.140101] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.140101] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 726.140101] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.140101] env[60164]: ERROR nova.compute.manager raise self.value [ 726.140101] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 726.140101] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 726.140101] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.140101] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 726.141068] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.141068] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 726.141068] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. 
[ 726.141068] env[60164]: ERROR nova.compute.manager [ 726.141068] env[60164]: Traceback (most recent call last): [ 726.141068] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 726.141068] env[60164]: listener.cb(fileno) [ 726.141068] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.141068] env[60164]: result = function(*args, **kwargs) [ 726.141068] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.141068] env[60164]: return func(*args, **kwargs) [ 726.141068] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 726.141068] env[60164]: raise e [ 726.141068] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 726.141068] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 726.141068] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 726.141068] env[60164]: created_port_ids = self._update_ports_for_instance( [ 726.141068] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 726.141068] env[60164]: with excutils.save_and_reraise_exception(): [ 726.141068] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.141068] env[60164]: self.force_reraise() [ 726.141068] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.141068] env[60164]: raise self.value [ 726.141068] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 726.141068] env[60164]: updated_port = self._update_port( [ 726.141068] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.141068] env[60164]: _ensure_no_port_binding_failure(port) [ 726.141068] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.141068] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 726.141865] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. [ 726.141865] env[60164]: Removing descriptor: 12 [ 726.141865] env[60164]: ERROR nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. 
[ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Traceback (most recent call last): [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] yield resources [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self.driver.spawn(context, instance, image_meta, [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.141865] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] vm_ref = self.build_virtual_machine(instance, [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] for vif in network_info: [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return self._sync_wrapper(fn, *args, **kwargs) [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self.wait() [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self[:] = self._gt.wait() [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return self._exit_event.wait() [ 726.142345] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 726.142345] env[60164]: ERROR nova.compute.manager 
[instance: b01c69b3-eec6-4577-8288-d4602da9e251] result = hub.switch() [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return self.greenlet.switch() [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] result = function(*args, **kwargs) [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return func(*args, **kwargs) [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] raise e [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] nwinfo = self.network_api.allocate_for_instance( [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] created_port_ids = self._update_ports_for_instance( [ 726.142771] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] with excutils.save_and_reraise_exception(): [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self.force_reraise() [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] raise self.value [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] updated_port = self._update_port( [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: 
b01c69b3-eec6-4577-8288-d4602da9e251] _ensure_no_port_binding_failure(port) [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] raise exception.PortBindingFailed(port_id=port['id']) [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. [ 726.143165] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] [ 726.144156] env[60164]: INFO nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Terminating instance [ 726.144610] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquiring lock "refresh_cache-b01c69b3-eec6-4577-8288-d4602da9e251" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 726.144610] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquired lock "refresh_cache-b01c69b3-eec6-4577-8288-d4602da9e251" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 726.144610] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 726.194945] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 726.377903] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02629134-463c-4c5d-9240-e42604bd2385 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.385801] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe5851bb-def7-405b-9051-d5e773b7e8c8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.420305] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 726.423195] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ef0d4a-2190-4c08-9e78-c31db03b29bc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.430785] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-381168fc-5068-457d-80f7-e45c97d98b72 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.445408] env[60164]: DEBUG nova.compute.provider_tree [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 726.448084] env[60164]: ERROR nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. 
[ 726.448084] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 726.448084] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 726.448084] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 726.448084] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 726.448084] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 726.448084] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 726.448084] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 726.448084] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.448084] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 726.448084] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.448084] env[60164]: ERROR nova.compute.manager raise self.value [ 726.448084] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 726.448084] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 726.448084] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.448084] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 726.448943] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.448943] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 726.448943] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. 
[ 726.448943] env[60164]: ERROR nova.compute.manager [ 726.448943] env[60164]: Traceback (most recent call last): [ 726.448943] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 726.448943] env[60164]: listener.cb(fileno) [ 726.448943] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.448943] env[60164]: result = function(*args, **kwargs) [ 726.448943] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.448943] env[60164]: return func(*args, **kwargs) [ 726.448943] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 726.448943] env[60164]: raise e [ 726.448943] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 726.448943] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 726.448943] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 726.448943] env[60164]: created_port_ids = self._update_ports_for_instance( [ 726.448943] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 726.448943] env[60164]: with excutils.save_and_reraise_exception(): [ 726.448943] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.448943] env[60164]: self.force_reraise() [ 726.448943] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.448943] env[60164]: raise self.value [ 726.448943] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 726.448943] env[60164]: updated_port = self._update_port( [ 726.448943] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.448943] env[60164]: _ensure_no_port_binding_failure(port) [ 726.448943] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.448943] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 726.449690] env[60164]: nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. [ 726.449690] env[60164]: Removing descriptor: 18 [ 726.449690] env[60164]: ERROR nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. 
[ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Traceback (most recent call last): [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] yield resources [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self.driver.spawn(context, instance, image_meta, [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.449690] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] vm_ref = self.build_virtual_machine(instance, [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] for vif in network_info: [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return self._sync_wrapper(fn, *args, **kwargs) [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self.wait() [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self[:] = self._gt.wait() [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return self._exit_event.wait() [ 726.450073] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 726.450073] env[60164]: ERROR nova.compute.manager 
[instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] result = hub.switch() [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return self.greenlet.switch() [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] result = function(*args, **kwargs) [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return func(*args, **kwargs) [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] raise e [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] nwinfo = self.network_api.allocate_for_instance( [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] created_port_ids = self._update_ports_for_instance( [ 726.450473] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] with excutils.save_and_reraise_exception(): [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self.force_reraise() [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] raise self.value [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] updated_port = self._update_port( [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: 
fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] _ensure_no_port_binding_failure(port) [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] raise exception.PortBindingFailed(port_id=port['id']) [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. [ 726.450953] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] [ 726.451335] env[60164]: INFO nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Terminating instance [ 726.452631] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "refresh_cache-fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 726.452785] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquired lock "refresh_cache-fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 726.452946] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 726.461435] env[60164]: DEBUG nova.scheduler.client.report [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 726.479376] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.480042] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 
tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 726.518038] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Successfully created port: 40443b08-0a04-4ae9-9f18-468b0fb8d3e6 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 726.526558] env[60164]: DEBUG nova.compute.utils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 726.527310] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 726.527647] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 726.538262] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 726.575861] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 726.638667] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 726.663402] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 726.663644] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 726.663795] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 726.663970] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 726.664441] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 726.664613] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 726.664910] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 726.665091] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 726.665261] 
env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 726.665423] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 726.665589] env[60164]: DEBUG nova.virt.hardware [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 726.666464] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6174e73c-a91d-4170-af90-1977fd147fff {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.675810] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14ab18f1-25aa-43a9-bad8-e4104357ed95 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.754255] env[60164]: DEBUG nova.policy [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56bb638542d440639e1a38b10e80fb1e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0be6718d0cbe4351a06b59576311c7f8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 726.785727] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.794994] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Releasing lock "refresh_cache-b01c69b3-eec6-4577-8288-d4602da9e251" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 726.795407] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 726.796035] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 726.796231] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ee2e3473-2e3e-4c39-963c-bed8e692b1cb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.807946] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ca72ad3-3ed6-409d-9fa1-5643b73332d9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 726.837995] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b01c69b3-eec6-4577-8288-d4602da9e251 could not be found. [ 726.837995] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 726.837995] env[60164]: INFO nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Took 0.04 seconds to destroy the instance on the hypervisor. [ 726.837995] env[60164]: DEBUG oslo.service.loopingcall [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 726.837995] env[60164]: DEBUG nova.compute.manager [-] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 726.838229] env[60164]: DEBUG nova.network.neutron [-] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 726.890162] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 726.925793] env[60164]: DEBUG nova.network.neutron [-] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 726.933142] env[60164]: DEBUG nova.network.neutron [-] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.944574] env[60164]: INFO nova.compute.manager [-] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Took 0.11 seconds to deallocate network for instance. [ 726.947677] env[60164]: DEBUG nova.compute.claims [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 726.947809] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.949520] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.207027] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb6e04ba-8f53-4d8f-9449-2471b52b3936 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.215901] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-236b6c00-6e93-40a2-8f8f-095c57536c46 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.252841] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd0d73d5-9d61-45c9-8fea-8e40ad801900 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.263631] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0013d756-c36e-4bac-b736-c12c8793bb4e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.282473] env[60164]: DEBUG nova.compute.provider_tree [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.304055] env[60164]: DEBUG nova.scheduler.client.report [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.345377] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.396s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.345377] env[60164]: ERROR nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Traceback (most recent call last): [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self.driver.spawn(context, instance, image_meta, [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.345377] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] vm_ref = self.build_virtual_machine(instance, [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] for vif in network_info: [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return self._sync_wrapper(fn, *args, **kwargs) [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self.wait() [ 727.345646] env[60164]: ERROR 
nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self[:] = self._gt.wait() [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return self._exit_event.wait() [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.345646] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] result = hub.switch() [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return self.greenlet.switch() [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] result = function(*args, **kwargs) [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] return func(*args, **kwargs) [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] raise e [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] nwinfo = self.network_api.allocate_for_instance( [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] created_port_ids = self._update_ports_for_instance( [ 727.346089] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] with excutils.save_and_reraise_exception(): [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] self.force_reraise() [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: 
b01c69b3-eec6-4577-8288-d4602da9e251] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] raise self.value [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] updated_port = self._update_port( [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] _ensure_no_port_binding_failure(port) [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] raise exception.PortBindingFailed(port_id=port['id']) [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] nova.exception.PortBindingFailed: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. [ 727.346428] env[60164]: ERROR nova.compute.manager [instance: b01c69b3-eec6-4577-8288-d4602da9e251] [ 727.346873] env[60164]: DEBUG nova.compute.utils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 727.348997] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Build of instance b01c69b3-eec6-4577-8288-d4602da9e251 was re-scheduled: Binding failed for port 9670ba89-f99d-442f-a928-00c5989967cf, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 727.349478] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 727.349924] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquiring lock "refresh_cache-b01c69b3-eec6-4577-8288-d4602da9e251" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.349924] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Acquired lock "refresh_cache-b01c69b3-eec6-4577-8288-d4602da9e251" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.350163] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 727.370150] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.385497] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Releasing lock "refresh_cache-fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 727.385924] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 727.386124] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 727.386625] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-22af5e41-723c-4b86-95b7-a8efe4095cc9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.397501] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c557e44-bfc3-4988-9501-31a8416c4bb6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.423090] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2 could not be found. [ 727.423090] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 727.423214] env[60164]: INFO nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 727.423391] env[60164]: DEBUG oslo.service.loopingcall [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 727.424055] env[60164]: DEBUG nova.compute.manager [-] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 727.424055] env[60164]: DEBUG nova.network.neutron [-] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 727.431980] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 727.484488] env[60164]: ERROR nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. [ 727.484488] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 727.484488] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 727.484488] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 727.484488] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 727.484488] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 727.484488] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 727.484488] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 727.484488] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.484488] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 727.484488] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.484488] env[60164]: ERROR nova.compute.manager raise self.value [ 727.484488] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 727.484488] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 727.484488] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.484488] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 727.484982] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.484982] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 727.484982] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. 
[ 727.484982] env[60164]: ERROR nova.compute.manager [ 727.484982] env[60164]: Traceback (most recent call last): [ 727.484982] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 727.484982] env[60164]: listener.cb(fileno) [ 727.484982] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.484982] env[60164]: result = function(*args, **kwargs) [ 727.484982] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.484982] env[60164]: return func(*args, **kwargs) [ 727.484982] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 727.484982] env[60164]: raise e [ 727.484982] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 727.484982] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 727.484982] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 727.484982] env[60164]: created_port_ids = self._update_ports_for_instance( [ 727.484982] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 727.484982] env[60164]: with excutils.save_and_reraise_exception(): [ 727.484982] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.484982] env[60164]: self.force_reraise() [ 727.484982] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.484982] env[60164]: raise self.value [ 727.484982] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 727.484982] env[60164]: updated_port = self._update_port( [ 727.484982] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.484982] env[60164]: _ensure_no_port_binding_failure(port) [ 727.484982] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.484982] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 727.485809] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. [ 727.485809] env[60164]: Removing descriptor: 19 [ 727.485809] env[60164]: ERROR nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. 
[ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Traceback (most recent call last): [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] yield resources [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self.driver.spawn(context, instance, image_meta, [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.485809] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] vm_ref = self.build_virtual_machine(instance, [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] for vif in network_info: [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return self._sync_wrapper(fn, *args, **kwargs) [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self.wait() [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self[:] = self._gt.wait() [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return self._exit_event.wait() [ 727.486215] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.486215] env[60164]: ERROR nova.compute.manager 
[instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] result = hub.switch() [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return self.greenlet.switch() [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] result = function(*args, **kwargs) [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return func(*args, **kwargs) [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] raise e [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] nwinfo = self.network_api.allocate_for_instance( [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] created_port_ids = self._update_ports_for_instance( [ 727.486663] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] with excutils.save_and_reraise_exception(): [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self.force_reraise() [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] raise self.value [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] updated_port = self._update_port( [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: 
fc85402b-7fcc-4060-b16a-f82d70d6886b] _ensure_no_port_binding_failure(port) [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] raise exception.PortBindingFailed(port_id=port['id']) [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. [ 727.487036] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] [ 727.487409] env[60164]: INFO nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Terminating instance [ 727.487775] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquiring lock "refresh_cache-fc85402b-7fcc-4060-b16a-f82d70d6886b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.487929] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquired lock "refresh_cache-fc85402b-7fcc-4060-b16a-f82d70d6886b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.488159] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 727.504256] env[60164]: DEBUG nova.network.neutron [-] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 727.518508] env[60164]: DEBUG nova.network.neutron [-] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.531142] env[60164]: INFO nova.compute.manager [-] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Took 0.11 seconds to deallocate network for instance. 
[ 727.533248] env[60164]: DEBUG nova.compute.claims [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 727.533841] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.534095] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.644047] env[60164]: WARNING oslo_vmware.rw_handles [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles response.begin() [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 727.644047] env[60164]: ERROR oslo_vmware.rw_handles [ 727.644047] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Downloaded image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 727.645478] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Caching image {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 727.645731] 
env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Copying Virtual Disk [datastore1] vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk to [datastore1] vmware_temp/f37647a9-666c-4009-a4b2-3aa09632a939/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk {{(pid=60164) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 727.646417] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3de95850-5aad-41b2-a9fc-ea5f4f19e296 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.655484] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 727.655484] env[60164]: value = "task-1295447" [ 727.655484] env[60164]: _type = "Task" [ 727.655484] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 727.665408] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': task-1295447, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 727.718059] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 727.771260] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47080ce2-a646-4f1b-9b62-afb7cdf258ca {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.779515] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a271dbe-63f3-4aaa-9cce-5d798b99be99 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.812710] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77bb3348-f031-4338-86b4-d06c469a7441 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.819510] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96973556-4299-4001-931a-aa0cfafb6204 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.835908] env[60164]: DEBUG nova.compute.provider_tree [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.850760] env[60164]: DEBUG nova.scheduler.client.report [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.872876] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.339s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.873509] env[60164]: ERROR nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. 
[ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Traceback (most recent call last): [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self.driver.spawn(context, instance, image_meta, [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] vm_ref = self.build_virtual_machine(instance, [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.873509] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] for vif in network_info: [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return self._sync_wrapper(fn, *args, **kwargs) [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self.wait() [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self[:] = self._gt.wait() [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return self._exit_event.wait() [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] result = hub.switch() [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.873850] env[60164]: ERROR 
nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return self.greenlet.switch() [ 727.873850] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] result = function(*args, **kwargs) [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] return func(*args, **kwargs) [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] raise e [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] nwinfo = self.network_api.allocate_for_instance( [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] created_port_ids = self._update_ports_for_instance( [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] with excutils.save_and_reraise_exception(): [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.874208] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] self.force_reraise() [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] raise self.value [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] updated_port = self._update_port( [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] _ensure_no_port_binding_failure(port) [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.874536] env[60164]: ERROR 
nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] raise exception.PortBindingFailed(port_id=port['id']) [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] nova.exception.PortBindingFailed: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. [ 727.874536] env[60164]: ERROR nova.compute.manager [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] [ 727.874536] env[60164]: DEBUG nova.compute.utils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 727.876884] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Build of instance fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2 was re-scheduled: Binding failed for port db9c5b4c-7f66-4453-85b1-d47606e0a329, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 727.876884] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 727.876884] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquiring lock "refresh_cache-fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.876884] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Acquired lock "refresh_cache-fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.877659] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 727.884157] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 727.888191] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 727.888333] env[60164]: 
DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 727.888835] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 727.889029] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60164) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10408}} [ 727.889252] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 727.898459] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.898459] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.898459] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 727.898555] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 727.899539] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fd093a3-893a-4668-8ce1-c1b7fc2d222f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.908639] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95416821-5f06-4257-a910-9512acc680f9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.924059] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dea3070-f1ff-425e-a52c-450cec148d27 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.930947] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818029bf-62be-461d-af69-1e3d767ccefc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.965631] env[60164]: DEBUG nova.compute.resource_tracker [None 
req-ed156dec-397c-455b-9740-451d876eb328 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181495MB free_disk=139GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 727.965631] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.965631] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.028649] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 7466dfd3-8756-40eb-91fd-c87f16b627ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.028814] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 68545276-63f2-4baf-8110-d3cc71686682 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.028952] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance b1361aa5-9bbd-4891-b74f-a0afd90b0bd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.029118] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.036568] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.043434] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 728.052164] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 728.053268] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Releasing lock "refresh_cache-b01c69b3-eec6-4577-8288-d4602da9e251" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.053486] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 728.053743] env[60164]: DEBUG nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 728.053894] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 728.065209] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance b01c69b3-eec6-4577-8288-d4602da9e251 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 728.065358] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance fc85402b-7fcc-4060-b16a-f82d70d6886b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.065482] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance e75afc9c-035c-4926-b72a-d570b5f2e6f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.065606] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 47d86b97-4bf1-40d4-b666-a530901d28dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.065725] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 9614d3ee-0911-4b50-9875-93ef3f7f2b5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 728.072664] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.076079] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance ab6859e4-807d-4b5f-943b-6491ed211c75 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 728.076293] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 728.076437] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 728.080980] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Releasing lock "refresh_cache-fc85402b-7fcc-4060-b16a-f82d70d6886b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.081370] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 728.081558] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 728.082101] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f226dd80-6ddc-4ff8-8546-b55e23fa0fb4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.091105] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a2396f3-921e-4bd5-a0ff-e819912b7f60 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.104876] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 728.111539] env[60164]: DEBUG nova.network.neutron [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.120978] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fc85402b-7fcc-4060-b16a-f82d70d6886b could not be found. [ 728.120978] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 728.120978] env[60164]: INFO nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 728.120978] env[60164]: DEBUG oslo.service.loopingcall [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 728.124171] env[60164]: DEBUG nova.compute.manager [-] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 728.124171] env[60164]: DEBUG nova.network.neutron [-] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 728.126202] env[60164]: INFO nova.compute.manager [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] [instance: b01c69b3-eec6-4577-8288-d4602da9e251] Took 0.07 seconds to deallocate network for instance. [ 728.185041] env[60164]: DEBUG nova.network.neutron [-] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 728.186554] env[60164]: DEBUG oslo_vmware.exceptions [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Fault InvalidArgument not matched. {{(pid=60164) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 728.189780] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.189780] env[60164]: ERROR nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 728.189780] env[60164]: Faults: ['InvalidArgument'] [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Traceback (most recent call last): [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] yield resources [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self.driver.spawn(context, instance, image_meta, [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.189780] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 728.190186] env[60164]: ERROR 
nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self._fetch_image_if_missing(context, vi) [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] image_cache(vi, tmp_image_ds_loc) [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] vm_util.copy_virtual_disk( [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] session._wait_for_task(vmdk_copy_task) [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] return self.wait_for_task(task_ref) [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] return evt.wait() [ 728.190186] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] result = hub.switch() [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] return self.greenlet.switch() [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self.f(*self.args, **self.kw) [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] raise exceptions.translate_fault(task_info.error) [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Faults: ['InvalidArgument'] [ 728.190538] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] [ 728.190538] env[60164]: INFO nova.compute.manager [None 
req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Terminating instance [ 728.190853] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.190853] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 728.191325] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "refresh_cache-7466dfd3-8756-40eb-91fd-c87f16b627ef" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.191325] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "refresh_cache-7466dfd3-8756-40eb-91fd-c87f16b627ef" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.191424] env[60164]: DEBUG nova.network.neutron [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 728.192186] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b8cc1a56-0a0f-41f1-823e-c635fb542888 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.197050] env[60164]: DEBUG nova.network.neutron [-] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.203439] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 728.203531] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60164) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 728.204428] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ee743541-17ce-4190-b367-33303501d800 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.209754] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Waiting for the task: (returnval){ [ 728.209754] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52ba8398-039e-8fdb-764d-99e0871d7da5" [ 728.209754] env[60164]: _type = "Task" [ 728.209754] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 728.210612] env[60164]: INFO nova.compute.manager [-] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Took 0.09 seconds to deallocate network for instance. [ 728.216747] env[60164]: DEBUG nova.compute.claims [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 728.216747] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.225291] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52ba8398-039e-8fdb-764d-99e0871d7da5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 728.238614] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Successfully created port: 54daa46a-e3a4-4c3a-80ab-96f92c47ae45 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 728.267997] env[60164]: INFO nova.scheduler.client.report [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Deleted allocations for instance b01c69b3-eec6-4577-8288-d4602da9e251 [ 728.278064] env[60164]: DEBUG nova.network.neutron [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 728.301706] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d5dd79ad-47c2-4efe-b1c2-54c3c2bb9651 tempest-AttachVolumeTestJSON-1745626859 tempest-AttachVolumeTestJSON-1745626859-project-member] Lock "b01c69b3-eec6-4577-8288-d4602da9e251" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.049s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.323188] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 728.329022] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-816dfaae-c1bf-48ef-aea3-90e0c9feb89b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.337124] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bed13e8-30d5-4870-921b-acf84015def0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.375447] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a759f088-034a-47cf-a590-ecd2bc5c54ff {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.388702] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80f85ab2-ceb8-4b0a-886f-5d44f883a9d8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.403499] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 728.405369] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.418056] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 728.437416] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 728.437416] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.472s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.437416] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.221s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.458191] env[60164]: DEBUG nova.network.neutron [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.472474] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.473669] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "refresh_cache-7466dfd3-8756-40eb-91fd-c87f16b627ef" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.474188] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 728.474379] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 728.476129] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b1785e-856c-4015-8850-c3612814b4c6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.484028] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Unregistering the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 728.484282] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e44386b3-925c-4050-a922-dbc2c048db75 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.496203] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Releasing lock "refresh_cache-fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.496420] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 728.496595] env[60164]: DEBUG nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 728.496755] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 728.511689] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Unregistered the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 728.511993] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Deleting contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 728.512773] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Deleting the datastore file [datastore1] 7466dfd3-8756-40eb-91fd-c87f16b627ef {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 728.512773] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4890e386-c349-448b-b9ca-2b9367d1ed61 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.519533] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 728.519533] env[60164]: value = "task-1295449" [ 728.519533] env[60164]: _type = "Task" [ 728.519533] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 728.530346] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': task-1295449, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 728.590969] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 728.611100] env[60164]: DEBUG nova.network.neutron [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.637994] env[60164]: INFO nova.compute.manager [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] [instance: fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2] Took 0.14 seconds to deallocate network for instance. [ 728.676096] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquiring lock "bd447698-8d52-4576-9d86-1a22e36bc3d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.676364] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Lock "bd447698-8d52-4576-9d86-1a22e36bc3d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.731608] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Preparing fetch location {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 728.732388] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Creating directory with path [datastore1] vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 728.732607] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2fbaf2f5-c657-4663-8d5b-a9baf6db8a51 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.749889] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Created directory with path [datastore1] vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 728.750045] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Fetch image to [datastore1] 
vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 728.750212] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to [datastore1] vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 728.754976] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df407cb8-3608-4eca-bae9-b9b49217dc7d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.764666] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c6fd513-0c32-41bd-bc00-e7ead8b507f0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.779266] env[60164]: INFO nova.scheduler.client.report [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Deleted allocations for instance fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2 [ 728.794264] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78bbc98b-6d42-41c6-9eac-12c008763cf2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.799101] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c575740f-7c74-465a-82f4-8395e8d2442a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.835800] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-050fd505-5bec-4f73-9329-0c51079769b0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.840604] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c628b98-1659-4664-b5cb-f7d12249e02a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.843294] env[60164]: DEBUG oslo_concurrency.lockutils [None req-1b405b20-af63-4812-ba02-dc6023967f4b tempest-ListServerFiltersTestJSON-1580368180 tempest-ListServerFiltersTestJSON-1580368180-project-member] Lock "fd4ad598-3fa2-4a7a-9226-9cf9dba03ce2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.656s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.874743] env[60164]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-556aaf4f-b97d-49c7-a3f5-110ac7fe1e59 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.878049] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] 
[instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 728.883837] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f903a4f4-15ea-4eaa-a694-2eeb9bac50c0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.889658] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b07612a6-fab1-488a-88ba-f80728d0621e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.904635] env[60164]: DEBUG nova.compute.provider_tree [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 728.918803] env[60164]: DEBUG nova.scheduler.client.report [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 728.933746] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.496s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.934444] env[60164]: ERROR nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. 
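(The PortBindingFailed in the entry above is raised by Nova's network-allocation path when Neutron hands back a port whose binding failed; the traceback in the following entries shows the call chain through _update_ports_for_instance, _update_port and _ensure_no_port_binding_failure in nova/network/neutron.py. Below is only a minimal illustrative sketch of that final check, not verbatim Nova source: the 'binding_failed' vif_type value and the stand-in exception class are assumptions made for the sketch.)

class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed (assumption for this sketch)."""

    def __init__(self, port_id: str) -> None:
        super().__init__(f"Binding failed for port {port_id}, "
                         "please check neutron logs for more information.")


def ensure_no_port_binding_failure(port: dict) -> None:
    """Refuse to continue the boot if Neutron reported the port binding as failed."""
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])


# Usage with the port id seen in this log entry:
failed_port = {'id': '4c38e9c1-0c1e-4465-acd5-be6ebc735624',
               'binding:vif_type': 'binding_failed'}
try:
    ensure_no_port_binding_failure(failed_port)
except PortBindingFailed as exc:
    print(exc)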
[ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Traceback (most recent call last): [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self.driver.spawn(context, instance, image_meta, [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] vm_ref = self.build_virtual_machine(instance, [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] vif_infos = vmwarevif.get_vif_info(self._session, [ 728.934444] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] for vif in network_info: [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return self._sync_wrapper(fn, *args, **kwargs) [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self.wait() [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self[:] = self._gt.wait() [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return self._exit_event.wait() [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] result = hub.switch() [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.934847] env[60164]: ERROR 
nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return self.greenlet.switch() [ 728.934847] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] result = function(*args, **kwargs) [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] return func(*args, **kwargs) [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] raise e [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] nwinfo = self.network_api.allocate_for_instance( [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] created_port_ids = self._update_ports_for_instance( [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] with excutils.save_and_reraise_exception(): [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.935282] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] self.force_reraise() [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] raise self.value [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] updated_port = self._update_port( [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] _ensure_no_port_binding_failure(port) [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.935708] env[60164]: ERROR 
nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] raise exception.PortBindingFailed(port_id=port['id']) [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] nova.exception.PortBindingFailed: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. [ 728.935708] env[60164]: ERROR nova.compute.manager [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] [ 728.936199] env[60164]: DEBUG nova.compute.utils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 728.937414] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.937797] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.532s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.939205] env[60164]: INFO nova.compute.claims [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 728.942068] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Build of instance fc85402b-7fcc-4060-b16a-f82d70d6886b was re-scheduled: Binding failed for port 4c38e9c1-0c1e-4465-acd5-be6ebc735624, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 728.942546] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 728.942733] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquiring lock "refresh_cache-fc85402b-7fcc-4060-b16a-f82d70d6886b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.942879] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Acquired lock "refresh_cache-fc85402b-7fcc-4060-b16a-f82d70d6886b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.943046] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 728.971540] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 729.020779] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 729.029334] env[60164]: DEBUG oslo_vmware.rw_handles [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 729.035961] env[60164]: DEBUG oslo_vmware.api [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': task-1295449, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043884} completed successfully. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 729.087687] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Deleted the datastore file {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 729.087687] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Deleted contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 729.087687] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 729.087687] env[60164]: INFO nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Took 0.61 seconds to destroy the instance on the hypervisor. [ 729.087978] env[60164]: DEBUG oslo.service.loopingcall [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 729.088455] env[60164]: DEBUG nova.compute.manager [-] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Skipping network deallocation for instance since networking was not requested. {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 729.091808] env[60164]: DEBUG nova.compute.claims [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 729.092191] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.092553] env[60164]: DEBUG oslo_vmware.rw_handles [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Completed reading data from the image iterator. 
{{(pid=60164) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 729.092681] env[60164]: DEBUG oslo_vmware.rw_handles [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 729.204406] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb0d7f78-ffc9-43f9-a316-f92d21064698 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.212416] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-391e389b-9b60-4b19-bc19-579112e7bb30 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.246566] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f81fa783-931a-4553-9fd3-269f8ad417b3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.254451] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc013eb0-c90e-495c-befb-32f4fcf61779 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.267623] env[60164]: DEBUG nova.compute.provider_tree [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.277652] env[60164]: DEBUG nova.scheduler.client.report [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.296707] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.297214] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 729.299622] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.362s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.300990] env[60164]: INFO nova.compute.claims [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 729.345596] env[60164]: DEBUG nova.compute.utils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 729.347957] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 729.347957] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 729.357808] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 729.439535] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 729.439746] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Starting heal instance info cache {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9789}} [ 729.439878] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Rebuilding the list of instances to heal {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9793}} [ 729.465201] env[60164]: DEBUG nova.policy [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7aaa8ce958041a48a24e19ccdc295b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b00aed0dfc1040dab445131a92d0ef27', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.469887] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.469981] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470118] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470227] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470323] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470439] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Skipping network cache update for instance because it is Building. 
{{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470559] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470693] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470810] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 729.470925] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Didn't find any instances for network info cache update. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9875}} [ 729.471943] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 729.507544] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 729.507904] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 729.508087] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 729.508274] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 
tempest-ServerActionsTestOtherA-1143455132-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 729.508414] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 729.508554] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 729.508754] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 729.508909] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 729.509083] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 729.509244] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 729.509411] env[60164]: DEBUG nova.virt.hardware [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 729.510535] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03be57bb-e7b0-486e-860c-e3dee24aef4b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.519165] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ca7026f-d773-43ec-8c86-585f5fba6cc4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.536242] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d4ac0c7-77a0-43c4-b16c-fd15ea080967 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.543104] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7ecba7b7-bd56-4c07-8c74-99566363347f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.576037] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fdd20cd-e233-44b4-9c38-adfd038c5cfd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.583911] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b919610-3200-46b6-9b33-02a0850c8f67 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.599723] env[60164]: DEBUG nova.compute.provider_tree [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.609180] env[60164]: DEBUG nova.scheduler.client.report [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.622937] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.323s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.623425] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Start building networks asynchronously for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 729.625747] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.534s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.661453] env[60164]: DEBUG nova.compute.utils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 729.662840] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 729.663024] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 729.682824] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 729.694404] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.704677] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Releasing lock "refresh_cache-fc85402b-7fcc-4060-b16a-f82d70d6886b" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.705016] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 729.705239] env[60164]: DEBUG nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 729.705548] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 729.762442] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 729.765492] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 729.772350] env[60164]: DEBUG nova.network.neutron [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.782675] env[60164]: INFO nova.compute.manager [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] [instance: fc85402b-7fcc-4060-b16a-f82d70d6886b] Took 0.08 seconds to deallocate network for instance. 
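The scheduler report client entries above repeatedly log the same inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f. As an aside on what those numbers imply under the usual placement semantics (reserved is withheld from the total, allocation_ratio overcommits it, max_unit caps any single allocation), the snippet below is a hypothetical illustration only, not Nova or placement code:

    # Inventory dict copied from the "Inventory has not changed for provider"
    # log records above; the helper is illustrative, not part of Nova.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 139},
    }

    def usable_capacity(inv):
        # (total - reserved) * allocation_ratio is what placement will hand out;
        # max_unit limits how much a single consumer (one instance) may take.
        return {
            rc: {
                'allocatable': int((spec['total'] - spec['reserved']) * spec['allocation_ratio']),
                'max_per_allocation': spec['max_unit'],
            }
            for rc, spec in inv.items()
        }

    print(usable_capacity(inventory))
    # VCPU: 192 allocatable (48 * 4.0), at most 16 per instance;
    # MEMORY_MB: 196078; DISK_GB: 400, at most 139 per allocation.

With the values logged here that works out to 192 allocatable vCPUs, 196078 MB of RAM, and 400 GB of disk, which is why the repeated claims on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 keep succeeding despite the failed builds.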
[ 729.806504] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 729.806759] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 729.806829] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 729.806998] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 729.807156] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 729.807290] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 729.807487] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 729.807691] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 729.807808] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 
tempest-SecurityGroupsTestJSON-1433999459-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 729.808321] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 729.808321] env[60164]: DEBUG nova.virt.hardware [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 729.809178] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-137278af-c833-47b2-af02-d398aeb969a6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.820416] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8084768f-0a2f-48c6-9d88-eb33429df207 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.849103] env[60164]: DEBUG nova.policy [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a32a17cf2b0429d97c45e2e0574d14f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e05387f01e784108985ef588ab2b8094', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.855834] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-069495a7-9746-4ca5-918b-29d1f92d21cc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.863241] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a6574fd-2818-454c-a6d8-91f38ab7be1e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.896759] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 729.901643] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd3adae-711b-44ca-b169-40800aa650e5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.908541] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-136376c4-5327-470c-9cd0-8ad6d930ab73 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.925673] env[60164]: DEBUG nova.compute.provider_tree [None 
req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.927644] env[60164]: INFO nova.scheduler.client.report [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Deleted allocations for instance fc85402b-7fcc-4060-b16a-f82d70d6886b [ 729.939738] env[60164]: DEBUG nova.scheduler.client.report [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.943193] env[60164]: DEBUG oslo_concurrency.lockutils [None req-2c4217d4-e9e0-46ea-830d-e5ef697e4504 tempest-ServerRescueTestJSONUnderV235-724313637 tempest-ServerRescueTestJSONUnderV235-724313637-project-member] Lock "fc85402b-7fcc-4060-b16a-f82d70d6886b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.591s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.957806] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.331s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.957806] env[60164]: ERROR nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 729.957806] env[60164]: Faults: ['InvalidArgument'] [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Traceback (most recent call last): [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self.driver.spawn(context, instance, image_meta, [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 
7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 729.957806] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self._fetch_image_if_missing(context, vi) [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] image_cache(vi, tmp_image_ds_loc) [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] vm_util.copy_virtual_disk( [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] session._wait_for_task(vmdk_copy_task) [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] return self.wait_for_task(task_ref) [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] return evt.wait() [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] result = hub.switch() [ 729.958573] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] return self.greenlet.switch() [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] self.f(*self.args, **self.kw) [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] raise exceptions.translate_fault(task_info.error) [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 729.959035] env[60164]: ERROR nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Faults: ['InvalidArgument'] [ 729.959035] env[60164]: ERROR 
nova.compute.manager [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] [ 729.959035] env[60164]: DEBUG nova.compute.utils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] VimFaultException {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 729.960574] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Build of instance 7466dfd3-8756-40eb-91fd-c87f16b627ef was re-scheduled: A specified parameter was not correct: fileType [ 729.960574] env[60164]: Faults: ['InvalidArgument'] {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 729.960955] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 729.961228] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "refresh_cache-7466dfd3-8756-40eb-91fd-c87f16b627ef" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 729.961421] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "refresh_cache-7466dfd3-8756-40eb-91fd-c87f16b627ef" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.961614] env[60164]: DEBUG nova.network.neutron [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 730.029778] env[60164]: DEBUG nova.network.neutron [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 730.223570] env[60164]: DEBUG nova.network.neutron [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.235364] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "refresh_cache-7466dfd3-8756-40eb-91fd-c87f16b627ef" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 730.238464] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 730.238464] env[60164]: DEBUG nova.compute.manager [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: 7466dfd3-8756-40eb-91fd-c87f16b627ef] Skipping network deallocation for instance since networking was not requested. {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 730.369006] env[60164]: INFO nova.scheduler.client.report [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Deleted allocations for instance 7466dfd3-8756-40eb-91fd-c87f16b627ef [ 730.379175] env[60164]: ERROR nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. 
[ 730.379175] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 730.379175] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 730.379175] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 730.379175] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 730.379175] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 730.379175] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 730.379175] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 730.379175] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 730.379175] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 730.379175] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 730.379175] env[60164]: ERROR nova.compute.manager raise self.value [ 730.379175] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 730.379175] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 730.379175] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 730.379175] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 730.379901] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 730.379901] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 730.379901] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. 
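The traceback above bottoms out in nova/network/neutron.py's _ensure_no_port_binding_failure, which raises PortBindingFailed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6. The snippet below is a rough, standalone sketch of that style of check, assuming Neutron reports a failed bind through the port's 'binding:vif_type' field; the exception class is a stand-in, not nova.exception:

    # Illustrative sketch only, not the Nova source. A port whose binding
    # ended up as 'binding_failed' was rejected by Neutron's mechanism
    # drivers, so the spawn is failed early instead of building VIFs from it.
    VIF_TYPE_BINDING_FAILED = 'binding_failed'

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(f"Binding failed for port {port_id}, "
                             "please check neutron logs for more information.")

    def ensure_no_port_binding_failure(port):
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])

    try:
        ensure_no_port_binding_failure(
            {'id': '40443b08-0a04-4ae9-9f18-468b0fb8d3e6',
             'binding:vif_type': VIF_TYPE_BINDING_FAILED})
    except PortBindingFailed as exc:
        print(exc)  # mirrors the message seen in the log records here

The eventlet traceback that follows is the same exception re-raised out of the _allocate_network_async greenthread via oslo_utils' save_and_reraise_exception, which is why the identical PortBindingFailed message appears twice in quick succession.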
[ 730.379901] env[60164]: ERROR nova.compute.manager [ 730.379901] env[60164]: Traceback (most recent call last): [ 730.379901] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 730.379901] env[60164]: listener.cb(fileno) [ 730.379901] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 730.379901] env[60164]: result = function(*args, **kwargs) [ 730.379901] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 730.379901] env[60164]: return func(*args, **kwargs) [ 730.379901] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 730.379901] env[60164]: raise e [ 730.379901] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 730.379901] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 730.379901] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 730.379901] env[60164]: created_port_ids = self._update_ports_for_instance( [ 730.379901] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 730.379901] env[60164]: with excutils.save_and_reraise_exception(): [ 730.379901] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 730.379901] env[60164]: self.force_reraise() [ 730.379901] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 730.379901] env[60164]: raise self.value [ 730.379901] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 730.379901] env[60164]: updated_port = self._update_port( [ 730.379901] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 730.379901] env[60164]: _ensure_no_port_binding_failure(port) [ 730.379901] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 730.379901] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 730.380728] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. [ 730.380728] env[60164]: Removing descriptor: 14 [ 730.380728] env[60164]: ERROR nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. 
[ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Traceback (most recent call last): [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] yield resources [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self.driver.spawn(context, instance, image_meta, [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 730.380728] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] vm_ref = self.build_virtual_machine(instance, [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] vif_infos = vmwarevif.get_vif_info(self._session, [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] for vif in network_info: [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return self._sync_wrapper(fn, *args, **kwargs) [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self.wait() [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self[:] = self._gt.wait() [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return self._exit_event.wait() [ 730.381075] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 730.381075] env[60164]: ERROR nova.compute.manager 
[instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] result = hub.switch() [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return self.greenlet.switch() [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] result = function(*args, **kwargs) [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return func(*args, **kwargs) [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] raise e [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] nwinfo = self.network_api.allocate_for_instance( [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] created_port_ids = self._update_ports_for_instance( [ 730.381615] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] with excutils.save_and_reraise_exception(): [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self.force_reraise() [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] raise self.value [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] updated_port = self._update_port( [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 
47d86b97-4bf1-40d4-b666-a530901d28dd] _ensure_no_port_binding_failure(port) [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] raise exception.PortBindingFailed(port_id=port['id']) [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. [ 730.382190] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] [ 730.382576] env[60164]: INFO nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Terminating instance [ 730.386262] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "refresh_cache-47d86b97-4bf1-40d4-b666-a530901d28dd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 730.386262] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquired lock "refresh_cache-47d86b97-4bf1-40d4-b666-a530901d28dd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 730.386262] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 730.387938] env[60164]: DEBUG oslo_concurrency.lockutils [None req-436b2d56-1260-4986-ae3c-b0d3c111722e tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "7466dfd3-8756-40eb-91fd-c87f16b627ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.376s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.460155] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 730.523190] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquiring lock "35cae673-166d-4ffc-90fb-aee3bdfd1710" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.523190] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Lock "35cae673-166d-4ffc-90fb-aee3bdfd1710" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.534033] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 730.585305] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.585610] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.587121] env[60164]: INFO nova.compute.claims [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 730.793382] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b7caec-64df-4627-b83d-8a148d53df3e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.802100] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8139929-a5b3-4834-bc24-60f5011714bb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.811312] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Successfully created port: 13ca33c8-9dde-4869-a11f-1bd3910b59be {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 730.846255] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 
tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.847974] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5a3e5b0-2f00-4806-b022-6a0d3cb83d21 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.857140] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afc1914f-7a9a-4bc2-99b7-ea74a22acb6d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.877078] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Releasing lock "refresh_cache-47d86b97-4bf1-40d4-b666-a530901d28dd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 730.877511] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 730.877699] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 730.878485] env[60164]: DEBUG nova.compute.provider_tree [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.882434] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c275849c-b001-40fb-862c-94658bf40f16 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.889669] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dea3adfb-361b-4027-a945-dcf21416056b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.908918] env[60164]: DEBUG nova.scheduler.client.report [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
730.925021] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 47d86b97-4bf1-40d4-b666-a530901d28dd could not be found. [ 730.925021] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 730.925021] env[60164]: INFO nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Took 0.05 seconds to destroy the instance on the hypervisor. [ 730.925021] env[60164]: DEBUG oslo.service.loopingcall [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 730.925441] env[60164]: DEBUG nova.compute.manager [-] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 730.925628] env[60164]: DEBUG nova.network.neutron [-] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 730.933105] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.941415] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 730.968833] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Successfully created port: 658e7dd4-2de9-447d-8402-096e9544e744 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 730.984737] env[60164]: DEBUG nova.network.neutron [-] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 730.992349] env[60164]: DEBUG nova.compute.utils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 730.994281] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 730.994389] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 731.002933] env[60164]: DEBUG nova.network.neutron [-] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.004301] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 731.018284] env[60164]: INFO nova.compute.manager [-] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Took 0.09 seconds to deallocate network for instance. [ 731.022905] env[60164]: DEBUG nova.compute.claims [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 731.022905] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.022905] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.075286] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 731.108154] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 731.108154] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 731.108154] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 731.108360] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 731.108360] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 731.108360] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 731.108360] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 731.108360] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 731.108523] env[60164]: DEBUG nova.virt.hardware [None 
req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 731.108523] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 731.108523] env[60164]: DEBUG nova.virt.hardware [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 731.109397] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01b923c1-3a36-4bcc-a7d9-fb76ddcec571 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.121163] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0843f3b-da2a-46ed-b124-4f59b98d9ad7 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.215376] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fd228c2-4aa9-49ed-b08b-dd8c35a9cef5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.223660] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8191c7f-18a7-407d-9fdf-faf2473a4552 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.267378] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea1a894-d927-4ab8-aedb-b2d5ba613c1e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.275132] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe869be1-d80d-469e-8d0e-4ab45a95fdfc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.293208] env[60164]: DEBUG nova.compute.provider_tree [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.307289] env[60164]: DEBUG nova.scheduler.client.report [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.330376] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.332220] env[60164]: ERROR nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Traceback (most recent call last): [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self.driver.spawn(context, instance, image_meta, [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] vm_ref = self.build_virtual_machine(instance, [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] vif_infos = vmwarevif.get_vif_info(self._session, [ 731.332220] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] for vif in network_info: [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return self._sync_wrapper(fn, *args, **kwargs) [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self.wait() [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 731.334394] env[60164]: 
ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self[:] = self._gt.wait() [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return self._exit_event.wait() [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] result = hub.switch() [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return self.greenlet.switch() [ 731.334394] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] result = function(*args, **kwargs) [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] return func(*args, **kwargs) [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] raise e [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] nwinfo = self.network_api.allocate_for_instance( [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] created_port_ids = self._update_ports_for_instance( [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] with excutils.save_and_reraise_exception(): [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 731.338586] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] self.force_reraise() [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 731.339187] env[60164]: ERROR 
nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] raise self.value [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] updated_port = self._update_port( [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] _ensure_no_port_binding_failure(port) [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] raise exception.PortBindingFailed(port_id=port['id']) [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] nova.exception.PortBindingFailed: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. [ 731.339187] env[60164]: ERROR nova.compute.manager [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] [ 731.339187] env[60164]: DEBUG nova.compute.utils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 731.339525] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Build of instance 47d86b97-4bf1-40d4-b666-a530901d28dd was re-scheduled: Binding failed for port 40443b08-0a04-4ae9-9f18-468b0fb8d3e6, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 731.339525] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 731.339525] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "refresh_cache-47d86b97-4bf1-40d4-b666-a530901d28dd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 731.339525] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquired lock "refresh_cache-47d86b97-4bf1-40d4-b666-a530901d28dd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 731.339699] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 731.385754] env[60164]: ERROR nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. 
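
Annotation: every PortBindingFailed traceback above ends in _ensure_no_port_binding_failure() at nova/network/neutron.py line 294. A minimal sketch of that check follows; the log confirms only the function name and the raise, so the 'binding:vif_type' comparison and the 'binding_failed' marker value are assumptions, and the class below is a stand-in rather than Nova source.

    # Hedged sketch of the check behind the PortBindingFailed tracebacks above.
    VIF_TYPE_BINDING_FAILED = 'binding_failed'  # assumed Neutron marker value


    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs for "
                "more information." % port_id)


    def _ensure_no_port_binding_failure(port):
        # Neutron flags a failed binding on the port itself; converting it to
        # an exception lets the compute manager abort and reschedule the build.
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])


    # Example: a port dict shaped like a Neutron port-update response.
    try:
        _ensure_no_port_binding_failure({
            'id': '40443b08-0a04-4ae9-9f18-468b0fb8d3e6',
            'binding:vif_type': 'binding_failed',
        })
    except PortBindingFailed as exc:
        print(exc)
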
[ 731.385754] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 731.385754] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 731.385754] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 731.385754] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 731.385754] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 731.385754] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 731.385754] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 731.385754] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 731.385754] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 731.385754] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 731.385754] env[60164]: ERROR nova.compute.manager raise self.value [ 731.385754] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 731.385754] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 731.385754] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 731.385754] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 731.386332] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 731.386332] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 731.386332] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. 
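
Annotation: the excutils.py frames in the traceback above (__exit__ / force_reraise / raise self.value) are the oslo_utils save_and_reraise_exception pattern: run cleanup when a step fails, then re-raise the original exception unchanged. A small self-contained usage sketch follows; the port-update helpers are hypothetical stand-ins, only the context-manager API is the real oslo_utils one.

    from oslo_utils import excutils


    def _update_port(port_id):
        raise ValueError("binding failed for port %s" % port_id)  # stand-in failure


    def _update_ports_for_instance(port_ids):
        created = []
        for port_id in port_ids:
            try:
                _update_port(port_id)
                created.append(port_id)
            except Exception:
                with excutils.save_and_reraise_exception():
                    # Cleanup runs first; the original exception is re-raised
                    # automatically when the context manager exits.
                    print("cleaning up already-created ports: %s" % created)


    try:
        _update_ports_for_instance(['ca937adf-d12a-4397-bdb1-e9c32bd7d7a4'])
    except ValueError as exc:
        print("caller still sees the original error: %s" % exc)
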
[ 731.386332] env[60164]: ERROR nova.compute.manager [ 731.386332] env[60164]: Traceback (most recent call last): [ 731.386332] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 731.386332] env[60164]: listener.cb(fileno) [ 731.386332] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 731.386332] env[60164]: result = function(*args, **kwargs) [ 731.386332] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 731.386332] env[60164]: return func(*args, **kwargs) [ 731.386332] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 731.386332] env[60164]: raise e [ 731.386332] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 731.386332] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 731.386332] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 731.386332] env[60164]: created_port_ids = self._update_ports_for_instance( [ 731.386332] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 731.386332] env[60164]: with excutils.save_and_reraise_exception(): [ 731.386332] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 731.386332] env[60164]: self.force_reraise() [ 731.386332] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 731.386332] env[60164]: raise self.value [ 731.386332] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 731.386332] env[60164]: updated_port = self._update_port( [ 731.386332] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 731.386332] env[60164]: _ensure_no_port_binding_failure(port) [ 731.386332] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 731.386332] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 731.387269] env[60164]: nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. [ 731.387269] env[60164]: Removing descriptor: 17 [ 731.387269] env[60164]: ERROR nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. 
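
Annotation: the "Instance failed to spawn" traceback that follows shows why the failure surfaces inside driver.spawn() rather than in the allocation call: network allocation runs in a background greenthread, and the deferred network_info only waits on it (and re-raises its exception) when the VMware driver first iterates the VIFs. The sketch below illustrates that deferral with plain eventlet; the class and function names are illustrative, not Nova's.

    import eventlet


    class NetworkInfoAsync:
        """Deferred network_info: blocks on the allocation greenthread on first use."""

        def __init__(self, allocate_fn):
            self._gt = eventlet.spawn(allocate_fn)
            self._vifs = None

        def __iter__(self):
            if self._vifs is None:
                # wait() re-raises whatever the greenthread raised, which is
                # why PortBindingFailed appears under driver.spawn().
                self._vifs = self._gt.wait()
            return iter(self._vifs)


    def allocate_for_instance():
        raise RuntimeError("Binding failed for port ca937adf-...")  # stand-in


    network_info = NetworkInfoAsync(allocate_for_instance)
    try:
        for vif in network_info:  # mirrors get_vif_info() iterating the VIFs
            pass
    except RuntimeError as exc:
        print("surfaced during spawn: %s" % exc)
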
[ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Traceback (most recent call last): [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] yield resources [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self.driver.spawn(context, instance, image_meta, [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 731.387269] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] vm_ref = self.build_virtual_machine(instance, [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] vif_infos = vmwarevif.get_vif_info(self._session, [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] for vif in network_info: [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return self._sync_wrapper(fn, *args, **kwargs) [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self.wait() [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self[:] = self._gt.wait() [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return self._exit_event.wait() [ 731.387625] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 731.387625] env[60164]: ERROR nova.compute.manager 
[instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] result = hub.switch() [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return self.greenlet.switch() [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] result = function(*args, **kwargs) [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return func(*args, **kwargs) [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] raise e [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] nwinfo = self.network_api.allocate_for_instance( [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] created_port_ids = self._update_ports_for_instance( [ 731.388038] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] with excutils.save_and_reraise_exception(): [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self.force_reraise() [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] raise self.value [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] updated_port = self._update_port( [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: 
e75afc9c-035c-4926-b72a-d570b5f2e6f0] _ensure_no_port_binding_failure(port) [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] raise exception.PortBindingFailed(port_id=port['id']) [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. [ 731.388425] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] [ 731.388806] env[60164]: INFO nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Terminating instance [ 731.389727] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "refresh_cache-e75afc9c-035c-4926-b72a-d570b5f2e6f0" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 731.389816] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquired lock "refresh_cache-e75afc9c-035c-4926-b72a-d570b5f2e6f0" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 731.389966] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 731.415647] env[60164]: DEBUG nova.policy [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfa6215d0fdf4e45821e1776d7c9e7d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2d138ee86d5a4657a3c40323d42a362b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 731.429123] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 731.482230] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 731.492242] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "946c73f8-1ed8-4180-a9d7-0b2970c4367e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.492242] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Lock "946c73f8-1ed8-4180-a9d7-0b2970c4367e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.506877] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 731.588252] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.588638] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.590017] env[60164]: INFO nova.compute.claims [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 731.749669] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "73b58043-e025-48ff-a22a-4d226c545456" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.750230] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "73b58043-e025-48ff-a22a-4d226c545456" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.779289] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 
tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "031e33fa-92ab-483b-ab38-ecf3bbfd1374" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.779289] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "031e33fa-92ab-483b-ab38-ecf3bbfd1374" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.816433] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.825556] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Releasing lock "refresh_cache-47d86b97-4bf1-40d4-b666-a530901d28dd" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.825959] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 731.825959] env[60164]: DEBUG nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 731.826130] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 731.837325] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.850473] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Releasing lock "refresh_cache-e75afc9c-035c-4926-b72a-d570b5f2e6f0" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.850924] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 731.851337] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 731.851835] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-77eb2801-02af-44b7-987b-12e6eebfa973 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.863232] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43c534d3-04d6-42ac-bcb4-ad8ac0d3ef01 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.877385] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 731.894936] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e75afc9c-035c-4926-b72a-d570b5f2e6f0 could not be found. 
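
Annotation: as in the earlier teardown of 47d86b97-..., the destroy path above treats a VM that is missing on the backend as already gone: the lookup raises InstanceNotFound, a warning is logged, and cleanup proceeds to "Instance destroyed" and network deallocation. A hypothetical sketch of that tolerate-missing pattern follows; the helper names are stand-ins, not Nova source.

    class InstanceNotFound(Exception):
        pass


    def find_vm(instance_uuid):
        # Stand-in for the backend lookup that fails in the log above.
        raise InstanceNotFound("Instance %s could not be found." % instance_uuid)


    def destroy(instance_uuid):
        try:
            vm_ref = find_vm(instance_uuid)
            # ... power off and unregister vm_ref on the hypervisor ...
        except InstanceNotFound as exc:
            print("WARNING Instance does not exist on backend: %s" % exc)
        # Destroy is treated as successful either way so teardown can continue.
        print("Instance destroyed")


    destroy("e75afc9c-035c-4926-b72a-d570b5f2e6f0")
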
[ 731.895320] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 731.895320] env[60164]: INFO nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 731.895531] env[60164]: DEBUG oslo.service.loopingcall [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 731.897059] env[60164]: DEBUG nova.compute.manager [-] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 731.897192] env[60164]: DEBUG nova.network.neutron [-] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 731.901823] env[60164]: DEBUG nova.network.neutron [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.914621] env[60164]: INFO nova.compute.manager [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: 47d86b97-4bf1-40d4-b666-a530901d28dd] Took 0.09 seconds to deallocate network for instance. [ 731.930421] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89d89964-8bc2-4b89-ac91-3cf2298299a5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.939018] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39a5132a-4bd9-4f88-96ca-d77f31416d84 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.971951] env[60164]: DEBUG nova.network.neutron [-] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 731.976325] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-095b24e7-e6f0-48f0-81e3-e5e278e1ae7b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.984049] env[60164]: DEBUG nova.network.neutron [-] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.987470] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96bad979-4c8e-42b2-8367-a72dfd74ef94 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.002388] env[60164]: DEBUG nova.compute.provider_tree [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.003571] env[60164]: INFO nova.compute.manager [-] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Took 0.11 seconds to deallocate network for instance. [ 732.006033] env[60164]: DEBUG nova.compute.claims [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 732.006033] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.021441] env[60164]: DEBUG nova.scheduler.client.report [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.042038] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.451s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.042038] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Start building 
networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 732.043610] env[60164]: INFO nova.scheduler.client.report [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Deleted allocations for instance 47d86b97-4bf1-40d4-b666-a530901d28dd [ 732.048601] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.042s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 732.072943] env[60164]: DEBUG oslo_concurrency.lockutils [None req-351813c6-aeba-4a5e-822b-065981327928 tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "47d86b97-4bf1-40d4-b666-a530901d28dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.197s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.103357] env[60164]: DEBUG nova.compute.utils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 732.104677] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Not allocating networking since 'none' was specified. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 732.113223] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 732.123954] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 732.168770] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.239026] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 732.273078] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 732.273370] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 732.273577] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 732.273934] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 732.274161] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 732.274374] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 732.274638] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 732.274918] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 732.275299] env[60164]: DEBUG nova.virt.hardware [None 
req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 732.275528] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 732.275847] env[60164]: DEBUG nova.virt.hardware [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 732.278302] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fd76f44-a3b6-4614-95ff-dc740a6a2ccf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.285410] env[60164]: ERROR nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. [ 732.285410] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 732.285410] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 732.285410] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 732.285410] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 732.285410] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 732.285410] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 732.285410] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 732.285410] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.285410] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 732.285410] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.285410] env[60164]: ERROR nova.compute.manager raise self.value [ 732.285410] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 732.285410] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 732.285410] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.285410] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 732.286324] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.286324] env[60164]: ERROR nova.compute.manager raise 
exception.PortBindingFailed(port_id=port['id']) [ 732.286324] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. [ 732.286324] env[60164]: ERROR nova.compute.manager [ 732.286324] env[60164]: Traceback (most recent call last): [ 732.286324] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 732.286324] env[60164]: listener.cb(fileno) [ 732.286324] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.286324] env[60164]: result = function(*args, **kwargs) [ 732.286324] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.286324] env[60164]: return func(*args, **kwargs) [ 732.286324] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 732.286324] env[60164]: raise e [ 732.286324] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 732.286324] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 732.286324] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 732.286324] env[60164]: created_port_ids = self._update_ports_for_instance( [ 732.286324] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 732.286324] env[60164]: with excutils.save_and_reraise_exception(): [ 732.286324] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.286324] env[60164]: self.force_reraise() [ 732.286324] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.286324] env[60164]: raise self.value [ 732.286324] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 732.286324] env[60164]: updated_port = self._update_port( [ 732.286324] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.286324] env[60164]: _ensure_no_port_binding_failure(port) [ 732.286324] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.286324] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 732.287231] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. [ 732.287231] env[60164]: Removing descriptor: 20 [ 732.287231] env[60164]: ERROR nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. 
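The tracebacks above both end in _ensure_no_port_binding_failure (nova/network/neutron.py:294) raising PortBindingFailed. A minimal sketch of that kind of check follows; the 'binding:vif_type' field and the 'binding_failed' sentinel follow the standard Neutron port API and are stated here as assumptions, not copied from Nova's source.

```python
# Minimal sketch of the check the tracebacks above end in
# (_ensure_no_port_binding_failure). Illustrative only, not Nova's
# verbatim source; port field names follow the Neutron port API.
VIF_TYPE_BINDING_FAILED = 'binding_failed'


class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(f"Binding failed for port {port_id}, "
                         "please check neutron logs for more information.")


def ensure_no_port_binding_failure(port):
    """Raise if Neutron reports the port's binding as failed."""
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])


try:
    ensure_no_port_binding_failure(
        {'id': '54daa46a-e3a4-4c3a-80ab-96f92c47ae45',
         'binding:vif_type': VIF_TYPE_BINDING_FAILED})
except PortBindingFailed as exc:
    print(exc)  # same message as the ERROR lines in this log
```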
[ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Traceback (most recent call last): [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] yield resources [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self.driver.spawn(context, instance, image_meta, [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 732.287231] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] vm_ref = self.build_virtual_machine(instance, [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] vif_infos = vmwarevif.get_vif_info(self._session, [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] for vif in network_info: [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return self._sync_wrapper(fn, *args, **kwargs) [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self.wait() [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self[:] = self._gt.wait() [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return self._exit_event.wait() [ 732.287537] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 732.287537] env[60164]: ERROR nova.compute.manager 
[instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] result = hub.switch() [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return self.greenlet.switch() [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] result = function(*args, **kwargs) [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return func(*args, **kwargs) [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] raise e [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] nwinfo = self.network_api.allocate_for_instance( [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] created_port_ids = self._update_ports_for_instance( [ 732.287886] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] with excutils.save_and_reraise_exception(): [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self.force_reraise() [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] raise self.value [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] updated_port = self._update_port( [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 
9614d3ee-0911-4b50-9875-93ef3f7f2b5f] _ensure_no_port_binding_failure(port) [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] raise exception.PortBindingFailed(port_id=port['id']) [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. [ 732.288869] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] [ 732.289332] env[60164]: INFO nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Terminating instance [ 732.296450] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 732.296669] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquired lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 732.297404] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 732.299248] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d385608-7fa1-42e1-a05b-39764620bf4a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.317819] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Instance VIF info [] {{(pid=60164) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 732.326512] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Creating folder: Project (cc5a77f597ba4db3a043b8962c824544). Parent ref: group-v277790. 
{{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 732.332264] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b10655d6-c27e-473e-9158-70c25e0e87ec {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.348226] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Created folder: Project (cc5a77f597ba4db3a043b8962c824544) in parent group-v277790. [ 732.348226] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Creating folder: Instances. Parent ref: group-v277805. {{(pid=60164) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 732.348226] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5932e304-ebaf-4111-abb9-cf75895deb29 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.359480] env[60164]: INFO nova.virt.vmwareapi.vm_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Created folder: Instances in parent group-v277805. [ 732.360726] env[60164]: DEBUG oslo.service.loopingcall [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 732.367848] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Creating VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 732.367848] env[60164]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad32dfc8-6833-413b-bbf0-1ed127198059 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.384707] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba4cfed1-c50a-436c-8ccb-d30e49949faf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.390894] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 732.395858] env[60164]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 732.395858] env[60164]: value = "task-1295452" [ 732.395858] env[60164]: _type = "Task" [ 732.395858] env[60164]: } to complete. 
{{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 732.397891] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5dc0f0b-c449-46e8-b87d-c8edae48c97d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.412058] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295452, 'name': CreateVM_Task} progress is 6%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 732.439130] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35bce545-f85d-4782-a92a-c7fdafb0b3f1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.447636] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c848bb89-e293-4e60-8192-28c9faa0dd76 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.462672] env[60164]: DEBUG nova.compute.provider_tree [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.474429] env[60164]: DEBUG nova.scheduler.client.report [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.490559] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.442s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.491144] env[60164]: ERROR nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. 
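The Acquiring / acquired (waited N s) / "released" (held N s) lines around this failure come from oslo.concurrency's lockutils wrapper. A short sketch of the two usual usage forms, assuming only that the lock name mirrors the resource tracker's "compute_resources" lock:

```python
# Sketch of the oslo.concurrency pattern that produces the
# Acquiring/acquired/released DEBUG lines (with waited/held timings)
# seen throughout this log. Assumes oslo.concurrency is installed.
from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def abort_instance_claim(instance_uuid):
    # Runs with the named lock held; lockutils' wrapper logs the
    # acquire, the wait time, and the hold time on release.
    print(f"claim aborted for {instance_uuid}")


# Equivalent explicit form as a context manager:
with lockutils.lock('compute_resources'):
    pass  # critical section

abort_instance_claim('e75afc9c-035c-4926-b72a-d570b5f2e6f0')
```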
[ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Traceback (most recent call last): [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self.driver.spawn(context, instance, image_meta, [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] vm_ref = self.build_virtual_machine(instance, [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] vif_infos = vmwarevif.get_vif_info(self._session, [ 732.491144] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] for vif in network_info: [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return self._sync_wrapper(fn, *args, **kwargs) [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self.wait() [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self[:] = self._gt.wait() [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return self._exit_event.wait() [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] result = hub.switch() [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 732.491524] env[60164]: ERROR 
nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return self.greenlet.switch() [ 732.491524] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] result = function(*args, **kwargs) [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] return func(*args, **kwargs) [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] raise e [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] nwinfo = self.network_api.allocate_for_instance( [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] created_port_ids = self._update_ports_for_instance( [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] with excutils.save_and_reraise_exception(): [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.492011] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] self.force_reraise() [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] raise self.value [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] updated_port = self._update_port( [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] _ensure_no_port_binding_failure(port) [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.492373] env[60164]: ERROR 
nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] raise exception.PortBindingFailed(port_id=port['id']) [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] nova.exception.PortBindingFailed: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. [ 732.492373] env[60164]: ERROR nova.compute.manager [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] [ 732.492373] env[60164]: DEBUG nova.compute.utils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 732.493551] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Build of instance e75afc9c-035c-4926-b72a-d570b5f2e6f0 was re-scheduled: Binding failed for port ca937adf-d12a-4397-bdb1-e9c32bd7d7a4, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 732.494045] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 732.494287] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquiring lock "refresh_cache-e75afc9c-035c-4926-b72a-d570b5f2e6f0" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 732.494464] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Acquired lock "refresh_cache-e75afc9c-035c-4926-b72a-d570b5f2e6f0" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 732.494665] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 732.496274] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.328s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 732.497863] env[60164]: INFO nova.compute.claims [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] 
Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 732.572652] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 732.721398] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dcdbc76-523a-4777-920f-a21637643e0e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.730046] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32ee653a-b7a7-4642-ac10-e031f2d2d9c0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.519143] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Successfully created port: 97ab4f54-0a33-4866-8e90-3302cbbff541 {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 733.527901] env[60164]: DEBUG nova.compute.manager [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Received event network-changed-54daa46a-e3a4-4c3a-80ab-96f92c47ae45 {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 733.527901] env[60164]: DEBUG nova.compute.manager [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Refreshing instance network info cache due to event network-changed-54daa46a-e3a4-4c3a-80ab-96f92c47ae45. {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 733.527901] env[60164]: DEBUG oslo_concurrency.lockutils [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] Acquiring lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.528988] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f9d82af-59cc-4a13-a67d-91f3998ca818 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.538171] env[60164]: DEBUG oslo_vmware.api [-] Task: {'id': task-1295452, 'name': CreateVM_Task, 'duration_secs': 0.256444} completed successfully. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 733.540275] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Created VM on the ESX host {{(pid=60164) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 733.540972] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.541125] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.541557] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 733.542697] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f3bc2b3-b23f-4549-a265-8084a971e4d2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.546456] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c03fa3a-b459-4beb-90a3-236251626288 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.559480] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Waiting for the task: (returnval){ [ 733.559480] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52d00874-dacb-5496-829d-51233a8117b9" [ 733.559480] env[60164]: _type = "Task" [ 733.559480] env[60164]: } to complete. 
{{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 733.559977] env[60164]: DEBUG nova.compute.provider_tree [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 733.569751] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 733.569868] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Processing image 1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 733.570212] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.571642] env[60164]: DEBUG nova.scheduler.client.report [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 733.588483] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.092s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.588794] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Start building networks asynchronously for instance. 
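The repeated "Inventory has not changed for provider ... based on inventory data: {...}" records above carry the provider's placement inventory. As an aside, the usable capacity conventionally derived from such a record is (total - reserved) * allocation_ratio; a small worked example on the values logged here, with the formula stated as the usual placement convention rather than taken from this log:

```python
# Deriving effective capacity from the inventory record logged above.
# (total - reserved) * allocation_ratio is the usual placement convention.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0,
                  'min_unit': 1, 'max_unit': 16, 'step_size': 1},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0,
                  'min_unit': 1, 'max_unit': 65530, 'step_size': 1},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0,
                  'min_unit': 1, 'max_unit': 139, 'step_size': 1},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: capacity={capacity:.0f}, max per allocation={inv['max_unit']}")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
```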
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 733.636046] env[60164]: DEBUG nova.compute.utils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 733.637657] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 733.637821] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 733.650490] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 733.730666] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquiring lock "ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.730907] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Lock "ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.740904] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Start spawning the instance on the hypervisor. 
{{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 733.768372] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 733.769037] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 733.769037] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 733.769037] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 733.769189] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 733.769226] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 733.769413] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 733.769593] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 733.770257] env[60164]: DEBUG nova.virt.hardware [None 
req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 733.770257] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 733.770257] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 733.770986] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ddc0af7-4542-4018-bde5-53ec1c64d460 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.779888] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1652fb25-af64-4d85-9a2d-2d792d97bb7d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.829179] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 733.844627] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Releasing lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 733.844956] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Start destroying the instance on the hypervisor. 
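The nova.virt.hardware DEBUG lines above walk a 1-vCPU flavor from "Build topologies for 1 vcpu(s) 1:1:1" to the single possible VirtCPUTopology(cores=1,sockets=1,threads=1). A simplified stand-in for that enumeration is sketched below; it lists the sockets*cores*threads factorizations of the vCPU count under the per-dimension limits shown in the log, and is not Nova's _get_possible_cpu_topologies implementation.

```python
# Simplified stand-in for the topology enumeration traced above:
# all (sockets, cores, threads) combinations whose product equals the
# vCPU count, capped by the 65536-per-dimension limits from the log.
from collections import namedtuple

Topology = namedtuple('Topology', 'sockets cores threads')


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        for cores in range(1, min(vcpus // sockets, max_cores) + 1):
            if (vcpus // sockets) % cores:
                continue
            threads = vcpus // (sockets * cores)
            if threads <= max_threads:
                found.append(Topology(sockets, cores, threads))
    return found


print(possible_topologies(1))  # [Topology(sockets=1, cores=1, threads=1)]
print(possible_topologies(4))  # e.g. (1,1,4), (1,2,2), (2,2,1), (4,1,1), ...
```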
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 733.845161] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 733.845473] env[60164]: DEBUG oslo_concurrency.lockutils [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] Acquired lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.845639] env[60164]: DEBUG nova.network.neutron [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Refreshing network info cache for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45 {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 733.846796] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4a252219-080c-4fe0-8930-69b58ce38ef9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.855870] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b297a589-92e9-4034-a274-6ac319e4dba3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.882378] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9614d3ee-0911-4b50-9875-93ef3f7f2b5f could not be found. [ 733.882606] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 733.883032] env[60164]: INFO nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 733.883032] env[60164]: DEBUG oslo.service.loopingcall [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 733.883280] env[60164]: DEBUG nova.compute.manager [-] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 733.883371] env[60164]: DEBUG nova.network.neutron [-] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 733.938817] env[60164]: DEBUG nova.network.neutron [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 733.957913] env[60164]: DEBUG nova.network.neutron [-] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 733.965302] env[60164]: DEBUG nova.network.neutron [-] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 733.976021] env[60164]: INFO nova.compute.manager [-] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Took 0.09 seconds to deallocate network for instance. [ 733.976021] env[60164]: DEBUG nova.compute.claims [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 733.976021] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.976241] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.008406] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.020993] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Releasing lock "refresh_cache-e75afc9c-035c-4926-b72a-d570b5f2e6f0" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.021257] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e 
tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 734.021436] env[60164]: DEBUG nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 734.021596] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 734.111858] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 734.116221] env[60164]: DEBUG nova.policy [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4cc2b0ed84534852a16f9fdd4a8977f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a98b1fd8031545e381db0682e508fc18', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 734.122855] env[60164]: DEBUG nova.network.neutron [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.140404] env[60164]: INFO nova.compute.manager [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] [instance: e75afc9c-035c-4926-b72a-d570b5f2e6f0] Took 0.12 seconds to deallocate network for instance. 
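A note on the recurring "Acquiring lock ..." / "Lock ... acquired ... waited ..." / "Lock ... released ... held ..." triples in this trace (for example the "compute_resources" lock taken while the claim for 9614d3ee-0911-4b50-9875-93ef3f7f2b5f is aborted above): they are emitted by oslo.concurrency's lockutils wrappers whenever Nova serializes access to shared state. A minimal sketch of that usage pattern follows, assuming only the public lockutils API; the toy class and functions are illustrative, not Nova's actual resource tracker.

    # Sketch only: the acquire/release DEBUG lines in this trace come from
    # oslo.concurrency wrapping calls like these. Names are illustrative.
    from oslo_concurrency import lockutils

    COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

    class ToyResourceTracker:
        # Decorator form: each call is bracketed by "Acquiring lock",
        # "acquired ... waited Ns" and "released ... held Ns" messages.
        @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
        def abort_instance_claim(self, instance_uuid):
            print("aborting claim for", instance_uuid)

    def instance_claim(instance_uuid):
        # Context-manager form of the same lock; concurrent callers block
        # here until the holder releases it.
        with lockutils.lock(COMPUTE_RESOURCE_SEMAPHORE):
            print("claiming resources for", instance_uuid)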
[ 734.225137] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f32d7d0-3db4-4bcf-a559-0521484d0bd3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.237767] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-618c92a1-e7fc-4a47-8495-969124dead5b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.273021] env[60164]: INFO nova.scheduler.client.report [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Deleted allocations for instance e75afc9c-035c-4926-b72a-d570b5f2e6f0 [ 734.281137] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e67279b4-f9e1-4392-967d-e6d101737e93 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.293160] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ab230e-e49e-4aeb-a2f1-687e142a5d85 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.307118] env[60164]: DEBUG nova.compute.provider_tree [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 734.312099] env[60164]: DEBUG oslo_concurrency.lockutils [None req-52da69a3-269c-4578-bdbd-d3c315a7f06e tempest-ServersAdminTestJSON-1751613823 tempest-ServersAdminTestJSON-1751613823-project-member] Lock "e75afc9c-035c-4926-b72a-d570b5f2e6f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.330s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.318914] env[60164]: DEBUG nova.scheduler.client.report [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 734.330049] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Starting instance... 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 734.331338] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.355s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.331922] env[60164]: ERROR nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Traceback (most recent call last): [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self.driver.spawn(context, instance, image_meta, [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] vm_ref = self.build_virtual_machine(instance, [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] vif_infos = vmwarevif.get_vif_info(self._session, [ 734.331922] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] for vif in network_info: [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return self._sync_wrapper(fn, *args, **kwargs) [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self.wait() [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 734.332451] env[60164]: ERROR 
nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self[:] = self._gt.wait() [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return self._exit_event.wait() [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] result = hub.switch() [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return self.greenlet.switch() [ 734.332451] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] result = function(*args, **kwargs) [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] return func(*args, **kwargs) [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] raise e [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] nwinfo = self.network_api.allocate_for_instance( [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] created_port_ids = self._update_ports_for_instance( [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] with excutils.save_and_reraise_exception(): [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.333034] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] self.force_reraise() [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.333514] env[60164]: ERROR 
nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] raise self.value [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] updated_port = self._update_port( [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] _ensure_no_port_binding_failure(port) [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] raise exception.PortBindingFailed(port_id=port['id']) [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] nova.exception.PortBindingFailed: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. [ 734.333514] env[60164]: ERROR nova.compute.manager [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] [ 734.333514] env[60164]: DEBUG nova.compute.utils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 734.334376] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Build of instance 9614d3ee-0911-4b50-9875-93ef3f7f2b5f was re-scheduled: Binding failed for port 54daa46a-e3a4-4c3a-80ab-96f92c47ae45, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 734.334555] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 734.334676] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquiring lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.379118] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.379374] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.380919] env[60164]: INFO nova.compute.claims [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 734.621405] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ea37d78-4005-430a-b35a-b6182edfe333 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.629796] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f3fe752-7120-46f5-ae9a-e6580def0682 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.662515] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af3d8462-8e52-4256-9160-315e089c5d6b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.670154] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31c2667a-ab40-4906-bf9f-72ea7c8901d9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.678044] env[60164]: DEBUG nova.network.neutron [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.687635] env[60164]: DEBUG nova.compute.provider_tree [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 
tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 734.690337] env[60164]: DEBUG oslo_concurrency.lockutils [req-3d27a6e5-6594-43ce-9afa-f795f318e5e3 req-45d84dd9-6646-4387-bbda-f70c69bfe0a0 service nova] Releasing lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.690815] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Acquired lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.690993] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 734.701757] env[60164]: DEBUG nova.scheduler.client.report [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 734.717714] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.718331] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 734.757726] env[60164]: DEBUG nova.compute.utils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 734.763021] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Allocating IP information in the background. 
{{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 734.763021] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 734.770516] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Start building block device mappings for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 734.775754] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 734.851900] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 734.881084] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 734.881474] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 734.881712] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 734.881915] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:383}} [ 734.882341] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 734.882341] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 734.882437] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 734.882576] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 734.882745] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 734.882905] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 734.883137] env[60164]: DEBUG nova.virt.hardware [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 734.883947] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c99e05a-599e-45e0-b943-3009894af63e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.893978] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5ec018e-c56d-44c6-bef1-bd9f6d26f8b8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.977031] env[60164]: DEBUG nova.policy [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4cc2b0ed84534852a16f9fdd4a8977f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a98b1fd8031545e381db0682e508fc18', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 
'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 735.219269] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.235822] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Releasing lock "refresh_cache-9614d3ee-0911-4b50-9875-93ef3f7f2b5f" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.236793] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 735.237204] env[60164]: DEBUG nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 735.237626] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 735.301269] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 735.310840] env[60164]: DEBUG nova.network.neutron [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.322193] env[60164]: INFO nova.compute.manager [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] [instance: 9614d3ee-0911-4b50-9875-93ef3f7f2b5f] Took 0.08 seconds to deallocate network for instance. 
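The rescheduling and cleanup above (instance 9614d3ee-0911-4b50-9875-93ef3f7f2b5f) was triggered by the PortBindingFailed traceback earlier in this trace, which ends in nova.network.neutron._ensure_no_port_binding_failure. A minimal sketch of what that guard does follows, assuming the usual Neutron port dict shape; the 'binding_failed' sentinel for binding:vif_type is an assumption based on Neutron's standard behaviour, and the local exception class stands in for nova.exception.PortBindingFailed.

    class PortBindingFailed(Exception):
        # Stand-in for nova.exception.PortBindingFailed.
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs "
                "for more information." % port_id)

    def _ensure_no_port_binding_failure(port):
        # Assumption: Neutron reports a port it could not bind to the host
        # (e.g. no suitable agent/mechanism driver) with
        # binding:vif_type == 'binding_failed'.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    try:
        _ensure_no_port_binding_failure(
            {'id': '54daa46a-e3a4-4c3a-80ab-96f92c47ae45',
             'binding:vif_type': 'binding_failed'})
    except PortBindingFailed as exc:
        print(exc)  # same message as the ERROR entries in this trace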
[ 735.423563] env[60164]: INFO nova.scheduler.client.report [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Deleted allocations for instance 9614d3ee-0911-4b50-9875-93ef3f7f2b5f [ 735.441328] env[60164]: DEBUG oslo_concurrency.lockutils [None req-9d85b446-e533-4385-85de-1ac0d0f19f0b tempest-AttachInterfacesTestJSON-1051727072 tempest-AttachInterfacesTestJSON-1051727072-project-member] Lock "9614d3ee-0911-4b50-9875-93ef3f7f2b5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.196s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.468568] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Starting instance... {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} [ 735.529202] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.529537] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.531414] env[60164]: INFO nova.compute.claims [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 735.783465] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b2afabe-fcf8-40f2-9c61-fd9901254e96 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.792467] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6132589-4016-471c-bf53-a26387d44e8f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.826387] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4206bf66-03b1-4f62-8119-c0160d2c3cb5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.832542] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Successfully created port: 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 735.840039] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-97d26083-b271-4d2a-98d6-2cd10b653e0a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.853203] env[60164]: DEBUG nova.compute.provider_tree [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.878267] env[60164]: DEBUG nova.scheduler.client.report [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.906846] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.907657] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Start building networks asynchronously for instance. {{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} [ 735.959048] env[60164]: DEBUG nova.compute.utils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Using /dev/sd instead of None {{(pid=60164) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 735.960498] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Allocating IP information in the background. {{(pid=60164) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 735.960687] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] allocate_for_instance() {{(pid=60164) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1143}} [ 735.975855] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Start building block device mappings for instance. 
{{(pid=60164) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} [ 736.074099] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Start spawning the instance on the hypervisor. {{(pid=60164) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} [ 736.104388] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-15T19:01:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-15T19:01:11Z,direct_url=,disk_format='vmdk',id=1618eb55-f00d-42a5-b978-e81e57855fb4,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='4c19b57a5b764a6480c1f6e2a9cdd5e4',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-15T19:01:12Z,virtual_size=,visibility=), allow threads: False {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 736.104612] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Flavor limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 736.104786] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Image limits 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 736.104987] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Flavor pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 736.105760] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Image pref 0:0:0 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 736.105760] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60164) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 736.105892] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 
736.105991] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 736.106171] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Got 1 possible topologies {{(pid=60164) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 736.106331] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 736.106498] env[60164]: DEBUG nova.virt.hardware [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60164) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 736.109738] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8cd57db-b2c0-44a9-b84b-768d085682b1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.118904] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-532ab1c4-8862-4993-8c4e-eaa1e789b27d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.368318] env[60164]: DEBUG nova.policy [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e12895c63134d80bf3f4f545baf2554', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd9480d686364ebab0d87bd617f302fa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60164) authorize /opt/stack/nova/nova/policy.py:203}} [ 737.018875] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Successfully created port: 521c92d4-0858-4e8f-a8a6-4774cc3624ae {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 737.918122] env[60164]: ERROR nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. 
[ 737.918122] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 737.918122] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 737.918122] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 737.918122] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 737.918122] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 737.918122] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 737.918122] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 737.918122] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.918122] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 737.918122] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.918122] env[60164]: ERROR nova.compute.manager raise self.value [ 737.918122] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 737.918122] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 737.918122] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.918122] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 737.918678] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.918678] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 737.918678] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. 
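The traceback above (and the second one that follows, after which the eventlet hub drops the descriptor) shows why the port-binding error surfaces far from where it happened: Nova allocates networking in a background greenthread ("Allocating IP information in the background"), and the exception is only re-raised once the spawn path iterates the network_info wrapper and calls wait(). A minimal sketch of that deferred-failure pattern with eventlet follows; the function is illustrative, not Nova's _allocate_network_async.

    import eventlet

    def allocate_networks(port_id):
        # Stand-in for the background network allocation that failed above.
        raise RuntimeError("Binding failed for port %s" % port_id)

    # The caller gets a handle back immediately and keeps building the
    # instance in the foreground...
    gt = eventlet.spawn(allocate_networks,
                        "13ca33c8-9dde-4869-a11f-1bd3910b59be")

    # ...and only hits the failure later, when it needs the result.
    try:
        gt.wait()
    except RuntimeError as exc:
        print("network allocation failed:", exc)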
[ 737.918678] env[60164]: ERROR nova.compute.manager [ 737.918678] env[60164]: Traceback (most recent call last): [ 737.918678] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 737.918678] env[60164]: listener.cb(fileno) [ 737.918678] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 737.918678] env[60164]: result = function(*args, **kwargs) [ 737.918678] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 737.918678] env[60164]: return func(*args, **kwargs) [ 737.918678] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 737.918678] env[60164]: raise e [ 737.918678] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 737.918678] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 737.918678] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 737.918678] env[60164]: created_port_ids = self._update_ports_for_instance( [ 737.918678] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 737.918678] env[60164]: with excutils.save_and_reraise_exception(): [ 737.918678] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.918678] env[60164]: self.force_reraise() [ 737.918678] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.918678] env[60164]: raise self.value [ 737.918678] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 737.918678] env[60164]: updated_port = self._update_port( [ 737.918678] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.918678] env[60164]: _ensure_no_port_binding_failure(port) [ 737.918678] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.918678] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 737.921441] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. [ 737.921441] env[60164]: Removing descriptor: 12 [ 737.921441] env[60164]: ERROR nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. 
[ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Traceback (most recent call last): [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] yield resources [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self.driver.spawn(context, instance, image_meta, [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self._vmops.spawn(context, instance, image_meta, injected_files, [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 737.921441] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] vm_ref = self.build_virtual_machine(instance, [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] vif_infos = vmwarevif.get_vif_info(self._session, [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] for vif in network_info: [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return self._sync_wrapper(fn, *args, **kwargs) [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self.wait() [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self[:] = self._gt.wait() [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return self._exit_event.wait() [ 737.921769] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 737.921769] env[60164]: ERROR nova.compute.manager 
[instance: ab6859e4-807d-4b5f-943b-6491ed211c75] result = hub.switch() [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return self.greenlet.switch() [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] result = function(*args, **kwargs) [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return func(*args, **kwargs) [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] raise e [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] nwinfo = self.network_api.allocate_for_instance( [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] created_port_ids = self._update_ports_for_instance( [ 737.922134] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] with excutils.save_and_reraise_exception(): [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self.force_reraise() [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] raise self.value [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] updated_port = self._update_port( [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: 
ab6859e4-807d-4b5f-943b-6491ed211c75] _ensure_no_port_binding_failure(port) [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] raise exception.PortBindingFailed(port_id=port['id']) [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. [ 737.922506] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] [ 737.922909] env[60164]: INFO nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Terminating instance [ 737.923822] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquiring lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 737.923993] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquired lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 737.924175] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 738.199187] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 738.308211] env[60164]: ERROR nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. 
[ 738.308211] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 738.308211] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 738.308211] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 738.308211] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 738.308211] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 738.308211] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 738.308211] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 738.308211] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.308211] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 738.308211] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.308211] env[60164]: ERROR nova.compute.manager raise self.value [ 738.308211] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 738.308211] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 738.308211] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.308211] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 738.308645] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.308645] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 738.308645] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. 
[ 738.308645] env[60164]: ERROR nova.compute.manager [ 738.308645] env[60164]: Traceback (most recent call last): [ 738.308645] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 738.308645] env[60164]: listener.cb(fileno) [ 738.308645] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 738.308645] env[60164]: result = function(*args, **kwargs) [ 738.308645] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 738.308645] env[60164]: return func(*args, **kwargs) [ 738.308645] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 738.308645] env[60164]: raise e [ 738.308645] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 738.308645] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 738.308645] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 738.308645] env[60164]: created_port_ids = self._update_ports_for_instance( [ 738.308645] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 738.308645] env[60164]: with excutils.save_and_reraise_exception(): [ 738.308645] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.308645] env[60164]: self.force_reraise() [ 738.308645] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.308645] env[60164]: raise self.value [ 738.308645] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 738.308645] env[60164]: updated_port = self._update_port( [ 738.308645] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.308645] env[60164]: _ensure_no_port_binding_failure(port) [ 738.308645] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.308645] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 738.309400] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. [ 738.309400] env[60164]: Removing descriptor: 19 [ 738.309400] env[60164]: ERROR nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. 
[ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Traceback (most recent call last): [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] yield resources [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self.driver.spawn(context, instance, image_meta, [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 738.309400] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] vm_ref = self.build_virtual_machine(instance, [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] vif_infos = vmwarevif.get_vif_info(self._session, [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] for vif in network_info: [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return self._sync_wrapper(fn, *args, **kwargs) [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self.wait() [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self[:] = self._gt.wait() [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return self._exit_event.wait() [ 738.309926] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 738.309926] env[60164]: ERROR nova.compute.manager 
[instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] result = hub.switch() [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return self.greenlet.switch() [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] result = function(*args, **kwargs) [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return func(*args, **kwargs) [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] raise e [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] nwinfo = self.network_api.allocate_for_instance( [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] created_port_ids = self._update_ports_for_instance( [ 738.310318] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] with excutils.save_and_reraise_exception(): [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self.force_reraise() [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] raise self.value [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] updated_port = self._update_port( [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: 
bd447698-8d52-4576-9d86-1a22e36bc3d5] _ensure_no_port_binding_failure(port) [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] raise exception.PortBindingFailed(port_id=port['id']) [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. [ 738.310683] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] [ 738.311784] env[60164]: INFO nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Terminating instance [ 738.312692] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquiring lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 738.312939] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquired lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.313031] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 738.426718] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 738.948608] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Successfully created port: 0b909c60-aba4-4d0b-8134-93a9bbbab5da {{(pid=60164) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 738.977371] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.988463] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Releasing lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 738.988881] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 738.989158] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 738.989652] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1283f6b7-a79a-4335-9012-6285b7744ac8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.003261] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b92ca2d5-8c30-4e9d-85ef-077b72cd1c4c {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.033751] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ab6859e4-807d-4b5f-943b-6491ed211c75 could not be found. [ 739.034006] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 739.034204] env[60164]: INFO nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 739.034452] env[60164]: DEBUG oslo.service.loopingcall [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 739.034685] env[60164]: DEBUG nova.compute.manager [-] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 739.034783] env[60164]: DEBUG nova.network.neutron [-] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 739.291156] env[60164]: DEBUG nova.network.neutron [-] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 739.309347] env[60164]: DEBUG nova.network.neutron [-] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.319911] env[60164]: INFO nova.compute.manager [-] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Took 0.28 seconds to deallocate network for instance. [ 739.327247] env[60164]: DEBUG nova.compute.claims [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 739.328100] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.328654] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.347646] env[60164]: DEBUG nova.compute.manager [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Received event network-changed-13ca33c8-9dde-4869-a11f-1bd3910b59be {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 739.347837] env[60164]: DEBUG nova.compute.manager [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Refreshing instance network info cache due to event network-changed-13ca33c8-9dde-4869-a11f-1bd3910b59be. 
{{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 739.352020] env[60164]: DEBUG oslo_concurrency.lockutils [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] Acquiring lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 739.352020] env[60164]: DEBUG oslo_concurrency.lockutils [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] Acquired lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 739.352020] env[60164]: DEBUG nova.network.neutron [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Refreshing network info cache for port 13ca33c8-9dde-4869-a11f-1bd3910b59be {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 739.354695] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.367607] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Releasing lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 739.367822] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 739.368155] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 739.368536] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6fbd7405-1194-47a1-a511-c81f1d6fd2df {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.383568] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc382d28-6052-485d-9a85-5c915febb368 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.409765] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bd447698-8d52-4576-9d86-1a22e36bc3d5 could not be found. 
[ 739.410146] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 739.410294] env[60164]: INFO nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 739.412309] env[60164]: DEBUG oslo.service.loopingcall [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 739.412309] env[60164]: DEBUG nova.compute.manager [-] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 739.412309] env[60164]: DEBUG nova.network.neutron [-] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 739.463514] env[60164]: DEBUG nova.network.neutron [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 739.503385] env[60164]: DEBUG nova.network.neutron [-] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 739.515989] env[60164]: DEBUG nova.network.neutron [-] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.529154] env[60164]: INFO nova.compute.manager [-] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Took 0.12 seconds to deallocate network for instance. 
[ 739.534038] env[60164]: DEBUG nova.compute.claims [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 739.535706] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.561186] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ac2136-5548-402c-835c-8cdf8e0fa0da {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.569780] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60342648-76af-477f-b33b-e614777e7afc {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.608567] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8a6d354-2db9-48f7-a2e5-a0e3ca534e99 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.617053] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dda11d8-f75e-48cb-8fc2-0d83d10a732b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.633480] env[60164]: DEBUG nova.compute.provider_tree [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.649879] env[60164]: DEBUG nova.scheduler.client.report [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.673835] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.345s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.675399] env[60164]: ERROR nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] 
[instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Traceback (most recent call last): [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self.driver.spawn(context, instance, image_meta, [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] vm_ref = self.build_virtual_machine(instance, [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] vif_infos = vmwarevif.get_vif_info(self._session, [ 739.675399] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] for vif in network_info: [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return self._sync_wrapper(fn, *args, **kwargs) [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self.wait() [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self[:] = self._gt.wait() [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return self._exit_event.wait() [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] result = hub.switch() [ 
739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return self.greenlet.switch() [ 739.675726] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] result = function(*args, **kwargs) [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] return func(*args, **kwargs) [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] raise e [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] nwinfo = self.network_api.allocate_for_instance( [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] created_port_ids = self._update_ports_for_instance( [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] with excutils.save_and_reraise_exception(): [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 739.676135] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] self.force_reraise() [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] raise self.value [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] updated_port = self._update_port( [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] _ensure_no_port_binding_failure(port) [ 
739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] raise exception.PortBindingFailed(port_id=port['id']) [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] nova.exception.PortBindingFailed: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. [ 739.676546] env[60164]: ERROR nova.compute.manager [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] [ 739.676546] env[60164]: DEBUG nova.compute.utils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 739.676856] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.142s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.680760] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Build of instance ab6859e4-807d-4b5f-943b-6491ed211c75 was re-scheduled: Binding failed for port 13ca33c8-9dde-4869-a11f-1bd3910b59be, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 739.680760] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 739.680760] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquiring lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 739.905033] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da819391-c3eb-48b3-8f90-aa5184665966 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.912673] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c7a3bb5-0cee-4dad-9e5f-6818f88a2a03 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.947832] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e340326-fec7-40e2-a13e-6dad2c68f884 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.956178] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e1c12ff-2a94-466e-bb43-aaa1237333fd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.973654] env[60164]: DEBUG nova.compute.provider_tree [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.982987] env[60164]: DEBUG nova.scheduler.client.report [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.003755] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.006721] env[60164]: ERROR nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d 
tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Traceback (most recent call last): [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self.driver.spawn(context, instance, image_meta, [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] vm_ref = self.build_virtual_machine(instance, [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] vif_infos = vmwarevif.get_vif_info(self._session, [ 740.006721] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] for vif in network_info: [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return self._sync_wrapper(fn, *args, **kwargs) [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self.wait() [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self[:] = self._gt.wait() [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return self._exit_event.wait() [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 740.007181] env[60164]: 
ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] result = hub.switch() [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return self.greenlet.switch() [ 740.007181] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] result = function(*args, **kwargs) [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] return func(*args, **kwargs) [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] raise e [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] nwinfo = self.network_api.allocate_for_instance( [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] created_port_ids = self._update_ports_for_instance( [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] with excutils.save_and_reraise_exception(): [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.007587] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] self.force_reraise() [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] raise self.value [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] updated_port = self._update_port( [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.007943] env[60164]: ERROR 
nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] _ensure_no_port_binding_failure(port) [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] raise exception.PortBindingFailed(port_id=port['id']) [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] nova.exception.PortBindingFailed: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. [ 740.007943] env[60164]: ERROR nova.compute.manager [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] [ 740.007943] env[60164]: DEBUG nova.compute.utils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 740.008246] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Build of instance bd447698-8d52-4576-9d86-1a22e36bc3d5 was re-scheduled: Binding failed for port 658e7dd4-2de9-447d-8402-096e9544e744, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 740.008246] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 740.008246] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquiring lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 740.008364] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Acquired lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 740.008452] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 740.342266] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 740.446837] env[60164]: DEBUG nova.network.neutron [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.456590] env[60164]: DEBUG oslo_concurrency.lockutils [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] Releasing lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 740.456799] env[60164]: DEBUG nova.compute.manager [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Received event network-changed-658e7dd4-2de9-447d-8402-096e9544e744 {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 740.456966] env[60164]: DEBUG nova.compute.manager [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Refreshing instance network info cache due to event network-changed-658e7dd4-2de9-447d-8402-096e9544e744. {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 740.457311] env[60164]: DEBUG oslo_concurrency.lockutils [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] Acquiring lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 740.457381] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Acquired lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 740.457692] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 740.586103] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 741.358253] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.368686] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Releasing lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.369053] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 741.369319] env[60164]: DEBUG nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 741.369382] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 741.371438] env[60164]: DEBUG oslo_concurrency.lockutils [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] Acquired lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 741.371619] env[60164]: DEBUG nova.network.neutron [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Refreshing network info cache for port 658e7dd4-2de9-447d-8402-096e9544e744 {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 741.491473] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 741.497349] env[60164]: DEBUG nova.network.neutron [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 741.500197] env[60164]: DEBUG nova.network.neutron [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.508823] env[60164]: INFO nova.compute.manager [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Took 0.14 seconds to deallocate network for instance. [ 741.583587] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.600674] env[60164]: INFO nova.scheduler.client.report [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Deleted allocations for instance bd447698-8d52-4576-9d86-1a22e36bc3d5 [ 741.606011] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Releasing lock "refresh_cache-ab6859e4-807d-4b5f-943b-6491ed211c75" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.606242] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 741.606417] env[60164]: DEBUG nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 741.606573] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 741.618681] env[60164]: DEBUG oslo_concurrency.lockutils [None req-37b5320c-8cb9-46d1-9324-af075ff3ca1d tempest-SecurityGroupsTestJSON-1433999459 tempest-SecurityGroupsTestJSON-1433999459-project-member] Lock "bd447698-8d52-4576-9d86-1a22e36bc3d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.942s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.715553] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 741.723617] env[60164]: DEBUG nova.network.neutron [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.733811] env[60164]: INFO nova.compute.manager [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Took 0.13 seconds to deallocate network for instance. 
[ 741.849132] env[60164]: INFO nova.scheduler.client.report [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Deleted allocations for instance ab6859e4-807d-4b5f-943b-6491ed211c75 [ 741.868752] env[60164]: DEBUG oslo_concurrency.lockutils [None req-84e10c45-1787-4e24-9273-ffcb4e8fc8ea tempest-ServerActionsTestOtherA-1143455132 tempest-ServerActionsTestOtherA-1143455132-project-member] Lock "ab6859e4-807d-4b5f-943b-6491ed211c75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.292s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.165225] env[60164]: DEBUG nova.network.neutron [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: bd447698-8d52-4576-9d86-1a22e36bc3d5] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.176266] env[60164]: DEBUG oslo_concurrency.lockutils [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] Releasing lock "refresh_cache-bd447698-8d52-4576-9d86-1a22e36bc3d5" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 742.176515] env[60164]: DEBUG nova.compute.manager [req-3bdfab69-1efd-4be5-90fc-633b961f2cb6 req-9f2fd91c-2466-42f0-8628-9e42a8208498 service nova] [instance: ab6859e4-807d-4b5f-943b-6491ed211c75] Received event network-vif-deleted-13ca33c8-9dde-4869-a11f-1bd3910b59be {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 742.848435] env[60164]: ERROR nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. 
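Every one of these build failures bottoms out in the same frame, /opt/stack/nova/nova/network/neutron.py line 294, _ensure_no_port_binding_failure(port). Below is a minimal, self-contained sketch of that check. The comparison against a 'binding_failed' vif type is an assumption about how Neutron marks ports it could not bind, not something taken from this log, and the exception class here is only a stand-in for nova.exception.PortBindingFailed.

# Sketch only: stand-ins for nova.exception.PortBindingFailed and the
# assumed constant value of nova.network.model.VIF_TYPE_BINDING_FAILED.
VIF_TYPE_BINDING_FAILED = 'binding_failed'

class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__("Binding failed for port %s, please check "
                         "neutron logs for more information." % port_id)

def _ensure_no_port_binding_failure(port):
    # Neutron hands the port back with binding:vif_type set to
    # 'binding_failed' when no mechanism driver could bind it; Nova
    # turns that into the PortBindingFailed seen in the tracebacks.
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])

# Example with the port from the failure just logged:
_ensure_no_port_binding_failure({
    'id': '97ab4f54-0a33-4866-8e90-3302cbbff541',
    'binding:vif_type': 'binding_failed',
})  # raises PortBindingFailed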
[ 742.848435] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 742.848435] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 742.848435] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 742.848435] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 742.848435] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 742.848435] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 742.848435] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 742.848435] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.848435] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 742.848435] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.848435] env[60164]: ERROR nova.compute.manager raise self.value [ 742.848435] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 742.848435] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 742.848435] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.848435] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 742.849290] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.849290] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 742.849290] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. 
[ 742.849290] env[60164]: ERROR nova.compute.manager [ 742.849290] env[60164]: Traceback (most recent call last): [ 742.849290] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 742.849290] env[60164]: listener.cb(fileno) [ 742.849290] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 742.849290] env[60164]: result = function(*args, **kwargs) [ 742.849290] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 742.849290] env[60164]: return func(*args, **kwargs) [ 742.849290] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 742.849290] env[60164]: raise e [ 742.849290] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 742.849290] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 742.849290] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 742.849290] env[60164]: created_port_ids = self._update_ports_for_instance( [ 742.849290] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 742.849290] env[60164]: with excutils.save_and_reraise_exception(): [ 742.849290] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.849290] env[60164]: self.force_reraise() [ 742.849290] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.849290] env[60164]: raise self.value [ 742.849290] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 742.849290] env[60164]: updated_port = self._update_port( [ 742.849290] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.849290] env[60164]: _ensure_no_port_binding_failure(port) [ 742.849290] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.849290] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 742.850070] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. [ 742.850070] env[60164]: Removing descriptor: 14 [ 742.850070] env[60164]: ERROR nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. 
[ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Traceback (most recent call last): [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] yield resources [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self.driver.spawn(context, instance, image_meta, [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self._vmops.spawn(context, instance, image_meta, injected_files, [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 742.850070] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] vm_ref = self.build_virtual_machine(instance, [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] vif_infos = vmwarevif.get_vif_info(self._session, [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] for vif in network_info: [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return self._sync_wrapper(fn, *args, **kwargs) [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self.wait() [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self[:] = self._gt.wait() [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return self._exit_event.wait() [ 742.850393] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 742.850393] env[60164]: ERROR nova.compute.manager 
[instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] result = hub.switch() [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return self.greenlet.switch() [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] result = function(*args, **kwargs) [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return func(*args, **kwargs) [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] raise e [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] nwinfo = self.network_api.allocate_for_instance( [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] created_port_ids = self._update_ports_for_instance( [ 742.850825] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] with excutils.save_and_reraise_exception(): [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self.force_reraise() [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] raise self.value [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] updated_port = self._update_port( [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 
35cae673-166d-4ffc-90fb-aee3bdfd1710] _ensure_no_port_binding_failure(port) [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] raise exception.PortBindingFailed(port_id=port['id']) [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. [ 742.851246] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] [ 742.851580] env[60164]: INFO nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Terminating instance [ 742.853687] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquiring lock "refresh_cache-35cae673-166d-4ffc-90fb-aee3bdfd1710" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 742.853883] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquired lock "refresh_cache-35cae673-166d-4ffc-90fb-aee3bdfd1710" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 742.854369] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 742.895730] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 743.222110] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.235268] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Releasing lock "refresh_cache-35cae673-166d-4ffc-90fb-aee3bdfd1710" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 743.235268] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 743.235925] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 743.237053] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-092c2c26-3675-4312-8951-28be7ce9d68f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.250202] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f774e986-b77f-46dd-aa6f-42e13b11d035 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.276582] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 35cae673-166d-4ffc-90fb-aee3bdfd1710 could not be found. [ 743.276854] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 743.277100] env[60164]: INFO nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Took 0.04 seconds to destroy the instance on the hypervisor. [ 743.277385] env[60164]: DEBUG oslo.service.loopingcall [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 743.277684] env[60164]: DEBUG nova.compute.manager [-] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 743.277823] env[60164]: DEBUG nova.network.neutron [-] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 743.515478] env[60164]: DEBUG nova.network.neutron [-] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 743.528835] env[60164]: DEBUG nova.network.neutron [-] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.537856] env[60164]: INFO nova.compute.manager [-] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Took 0.26 seconds to deallocate network for instance. [ 743.540230] env[60164]: DEBUG nova.compute.claims [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 743.540400] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.540635] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.717334] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc8ba9b8-3943-4add-a69f-6823a80f00c4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.726558] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4239de24-fd13-426a-adb5-379fed0a7a2a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.764350] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50808dc-69ce-4fcd-b247-9e1c80af7a14 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.768432] env[60164]: DEBUG nova.compute.manager [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Received event network-changed-287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10979}} [ 743.768689] env[60164]: DEBUG nova.compute.manager 
[req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Refreshing instance network info cache due to event network-changed-287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed. {{(pid=60164) external_instance_event /opt/stack/nova/nova/compute/manager.py:10984}} [ 743.768977] env[60164]: DEBUG oslo_concurrency.lockutils [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] Acquiring lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.769186] env[60164]: DEBUG oslo_concurrency.lockutils [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] Acquired lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 743.769390] env[60164]: DEBUG nova.network.neutron [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Refreshing network info cache for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1986}} [ 743.778858] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bfb3c5a-4a42-43a1-8237-7d97322fc2aa {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.795853] env[60164]: DEBUG nova.compute.provider_tree [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.806946] env[60164]: DEBUG nova.scheduler.client.report [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.824412] env[60164]: ERROR nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. 
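The Acquiring/Acquired/Releasing lines around the "refresh_cache-<uuid>" names come from oslo.concurrency's lockutils (the lockutils.py:312/315/333 call sites above). A rough sketch of that pattern follows, using the plain in-process lock() context manager; the body of the critical section is a placeholder, not Nova's actual cache-refresh code.

from oslo_concurrency import lockutils

instance_uuid = '73b58043-e025-48ff-a22a-4d226c545456'  # uuid taken from the log above

# Serializes concurrent refreshes of one instance's network info cache.
with lockutils.lock('refresh_cache-' + instance_uuid):
    network_info = []  # e.g. rebuilt from Neutron port data
    # ... update the instance's info cache with network_info ...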
[ 743.824412] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 743.824412] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 743.824412] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 743.824412] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 743.824412] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 743.824412] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 743.824412] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 743.824412] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.824412] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 743.824412] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.824412] env[60164]: ERROR nova.compute.manager raise self.value [ 743.824412] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 743.824412] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 743.824412] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.824412] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 743.824903] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.824903] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 743.824903] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. 
[ 743.824903] env[60164]: ERROR nova.compute.manager [ 743.824903] env[60164]: Traceback (most recent call last): [ 743.824903] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 743.824903] env[60164]: listener.cb(fileno) [ 743.824903] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 743.824903] env[60164]: result = function(*args, **kwargs) [ 743.824903] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 743.824903] env[60164]: return func(*args, **kwargs) [ 743.824903] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 743.824903] env[60164]: raise e [ 743.824903] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 743.824903] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 743.824903] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 743.824903] env[60164]: created_port_ids = self._update_ports_for_instance( [ 743.824903] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 743.824903] env[60164]: with excutils.save_and_reraise_exception(): [ 743.824903] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.824903] env[60164]: self.force_reraise() [ 743.824903] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.824903] env[60164]: raise self.value [ 743.824903] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 743.824903] env[60164]: updated_port = self._update_port( [ 743.824903] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.824903] env[60164]: _ensure_no_port_binding_failure(port) [ 743.824903] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.824903] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 743.825657] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. [ 743.825657] env[60164]: Removing descriptor: 17 [ 743.825657] env[60164]: ERROR nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. 
[ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] Traceback (most recent call last): [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] yield resources [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self.driver.spawn(context, instance, image_meta, [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self._vmops.spawn(context, instance, image_meta, injected_files, [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 743.825657] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] vm_ref = self.build_virtual_machine(instance, [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] vif_infos = vmwarevif.get_vif_info(self._session, [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] for vif in network_info: [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return self._sync_wrapper(fn, *args, **kwargs) [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self.wait() [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self[:] = self._gt.wait() [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return self._exit_event.wait() [ 743.825978] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 743.825978] env[60164]: ERROR nova.compute.manager 
[instance: 73b58043-e025-48ff-a22a-4d226c545456] result = hub.switch() [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return self.greenlet.switch() [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] result = function(*args, **kwargs) [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return func(*args, **kwargs) [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] raise e [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] nwinfo = self.network_api.allocate_for_instance( [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] created_port_ids = self._update_ports_for_instance( [ 743.826342] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] with excutils.save_and_reraise_exception(): [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self.force_reraise() [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] raise self.value [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] updated_port = self._update_port( [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 
73b58043-e025-48ff-a22a-4d226c545456] _ensure_no_port_binding_failure(port) [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] raise exception.PortBindingFailed(port_id=port['id']) [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. [ 743.826734] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] [ 743.827166] env[60164]: INFO nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Terminating instance [ 743.832113] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.834357] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.294s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.834995] env[60164]: ERROR nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. 
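Each of these errors defers to Neutron ("please check neutron logs for more information"). One way to look at the failed binding from the API side is sketched below with openstacksdk and one of the port IDs from this log; the connection setup assumes credentials are available via clouds.yaml or OS_* environment variables.

import openstack

conn = openstack.connect()  # assumes a configured cloud (clouds.yaml / OS_* env vars)

# Port from the failure above; a 'binding_failed' vif type together with
# binding_host_id usually narrows down which neutron agent/host to inspect.
port = conn.network.get_port('97ab4f54-0a33-4866-8e90-3302cbbff541')
print(port.binding_vif_type, port.binding_host_id, port.status)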
[ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Traceback (most recent call last): [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self.driver.spawn(context, instance, image_meta, [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self._vmops.spawn(context, instance, image_meta, injected_files, [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] vm_ref = self.build_virtual_machine(instance, [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] vif_infos = vmwarevif.get_vif_info(self._session, [ 743.834995] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] for vif in network_info: [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return self._sync_wrapper(fn, *args, **kwargs) [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self.wait() [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self[:] = self._gt.wait() [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return self._exit_event.wait() [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] result = hub.switch() [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 743.835329] env[60164]: ERROR 
nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return self.greenlet.switch() [ 743.835329] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] result = function(*args, **kwargs) [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] return func(*args, **kwargs) [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] raise e [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] nwinfo = self.network_api.allocate_for_instance( [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] created_port_ids = self._update_ports_for_instance( [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] with excutils.save_and_reraise_exception(): [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.835724] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] self.force_reraise() [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] raise self.value [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] updated_port = self._update_port( [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] _ensure_no_port_binding_failure(port) [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.836072] env[60164]: ERROR 
nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] raise exception.PortBindingFailed(port_id=port['id']) [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] nova.exception.PortBindingFailed: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. [ 743.836072] env[60164]: ERROR nova.compute.manager [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] [ 743.836072] env[60164]: DEBUG nova.compute.utils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 743.838493] env[60164]: DEBUG nova.network.neutron [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 743.845281] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Build of instance 35cae673-166d-4ffc-90fb-aee3bdfd1710 was re-scheduled: Binding failed for port 97ab4f54-0a33-4866-8e90-3302cbbff541, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 743.845968] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 743.845968] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquiring lock "refresh_cache-35cae673-166d-4ffc-90fb-aee3bdfd1710" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.846117] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Acquired lock "refresh_cache-35cae673-166d-4ffc-90fb-aee3bdfd1710" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 743.846227] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 744.193860] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 744.570762] env[60164]: DEBUG nova.network.neutron [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.585336] env[60164]: DEBUG oslo_concurrency.lockutils [req-b1b0f5f6-4446-4c43-abeb-17bef2ab55fb req-2f3a2a27-d74d-45f2-a5df-6fb6e106ae83 service nova] Releasing lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 744.587115] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.587371] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 744.639329] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.650020] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Releasing lock "refresh_cache-35cae673-166d-4ffc-90fb-aee3bdfd1710" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 744.650273] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 744.650452] env[60164]: DEBUG nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 744.650942] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 744.694338] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 744.702262] env[60164]: DEBUG nova.network.neutron [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.707738] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 744.729434] env[60164]: INFO nova.compute.manager [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] [instance: 35cae673-166d-4ffc-90fb-aee3bdfd1710] Took 0.08 seconds to deallocate network for instance. [ 744.739047] env[60164]: ERROR nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. 
[ 744.739047] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 744.739047] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 744.739047] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 744.739047] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 744.739047] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 744.739047] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 744.739047] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 744.739047] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 744.739047] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 744.739047] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 744.739047] env[60164]: ERROR nova.compute.manager raise self.value [ 744.739047] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 744.739047] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 744.739047] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 744.739047] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 744.739650] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 744.739650] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 744.739650] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. 
[ 744.739650] env[60164]: ERROR nova.compute.manager [ 744.739650] env[60164]: Traceback (most recent call last): [ 744.739650] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 744.739650] env[60164]: listener.cb(fileno) [ 744.739650] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 744.739650] env[60164]: result = function(*args, **kwargs) [ 744.739650] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 744.739650] env[60164]: return func(*args, **kwargs) [ 744.739650] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 744.739650] env[60164]: raise e [ 744.739650] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 744.739650] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 744.739650] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 744.739650] env[60164]: created_port_ids = self._update_ports_for_instance( [ 744.739650] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 744.739650] env[60164]: with excutils.save_and_reraise_exception(): [ 744.739650] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 744.739650] env[60164]: self.force_reraise() [ 744.739650] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 744.739650] env[60164]: raise self.value [ 744.739650] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 744.739650] env[60164]: updated_port = self._update_port( [ 744.739650] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 744.739650] env[60164]: _ensure_no_port_binding_failure(port) [ 744.739650] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 744.739650] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 744.740505] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. [ 744.740505] env[60164]: Removing descriptor: 18 [ 744.740505] env[60164]: ERROR nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. 
[ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Traceback (most recent call last): [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] yield resources [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self.driver.spawn(context, instance, image_meta, [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self._vmops.spawn(context, instance, image_meta, injected_files, [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 744.740505] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] vm_ref = self.build_virtual_machine(instance, [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] vif_infos = vmwarevif.get_vif_info(self._session, [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] for vif in network_info: [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return self._sync_wrapper(fn, *args, **kwargs) [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self.wait() [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self[:] = self._gt.wait() [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return self._exit_event.wait() [ 744.740872] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 744.740872] env[60164]: ERROR nova.compute.manager 
[instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] result = hub.switch() [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return self.greenlet.switch() [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] result = function(*args, **kwargs) [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return func(*args, **kwargs) [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] raise e [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] nwinfo = self.network_api.allocate_for_instance( [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] created_port_ids = self._update_ports_for_instance( [ 744.741316] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] with excutils.save_and_reraise_exception(): [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self.force_reraise() [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] raise self.value [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] updated_port = self._update_port( [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 
031e33fa-92ab-483b-ab38-ecf3bbfd1374] _ensure_no_port_binding_failure(port) [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] raise exception.PortBindingFailed(port_id=port['id']) [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. [ 744.741687] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] [ 744.743358] env[60164]: INFO nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Terminating instance [ 744.745589] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-031e33fa-92ab-483b-ab38-ecf3bbfd1374" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 744.745752] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-031e33fa-92ab-483b-ab38-ecf3bbfd1374" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.745916] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 744.806127] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 744.857792] env[60164]: INFO nova.scheduler.client.report [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Deleted allocations for instance 35cae673-166d-4ffc-90fb-aee3bdfd1710 [ 744.883038] env[60164]: DEBUG oslo_concurrency.lockutils [None req-4b034ead-b341-4094-8a86-1293a94d7995 tempest-ImagesOneServerTestJSON-356266125 tempest-ImagesOneServerTestJSON-356266125-project-member] Lock "35cae673-166d-4ffc-90fb-aee3bdfd1710" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.359s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.163117] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.173528] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-031e33fa-92ab-483b-ab38-ecf3bbfd1374" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.173998] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 745.174216] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 745.175277] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.177173] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4bd48300-f848-40b5-b9a2-ebe4f955e02a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.186234] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-054d0095-0536-48ea-a4d3-f3a936706818 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.202030] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.202605] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 745.202870] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 745.205409] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-32a09cc7-16bf-4646-83d1-7b342ae167fb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.220841] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12cd145b-b7e5-4dd0-b696-1f47ad63712b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.234187] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 031e33fa-92ab-483b-ab38-ecf3bbfd1374 could not be found. 
[ 745.234917] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 745.234917] env[60164]: INFO nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Took 0.06 seconds to destroy the instance on the hypervisor. [ 745.235049] env[60164]: DEBUG oslo.service.loopingcall [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 745.235245] env[60164]: DEBUG nova.compute.manager [-] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 745.235335] env[60164]: DEBUG nova.network.neutron [-] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 745.250448] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 73b58043-e025-48ff-a22a-4d226c545456 could not be found. [ 745.250702] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 745.252704] env[60164]: INFO nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Took 0.05 seconds to destroy the instance on the hypervisor. [ 745.252704] env[60164]: DEBUG oslo.service.loopingcall [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 745.252704] env[60164]: DEBUG nova.compute.manager [-] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 745.252704] env[60164]: DEBUG nova.network.neutron [-] [instance: 73b58043-e025-48ff-a22a-4d226c545456] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 745.294988] env[60164]: DEBUG nova.network.neutron [-] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 745.306016] env[60164]: DEBUG nova.network.neutron [-] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.313366] env[60164]: INFO nova.compute.manager [-] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Took 0.06 seconds to deallocate network for instance. [ 745.315857] env[60164]: DEBUG nova.compute.claims [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 745.319481] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.319481] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.322944] env[60164]: DEBUG nova.network.neutron [-] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 745.341239] env[60164]: DEBUG nova.network.neutron [-] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.362715] env[60164]: INFO nova.compute.manager [-] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Took 0.13 seconds to deallocate network for instance. 
[ 745.363373] env[60164]: DEBUG nova.compute.claims [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 745.363701] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.527777] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e00a4bf-ca46-4860-ba7d-8aeaa24fc796 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.536834] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3486e27-e911-4131-8f2c-a1995238c69f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.575029] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c39c52b-78dd-4d8d-b574-ff299117157b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.583489] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91e0050a-a754-4eb1-b1a9-50ebd07acc90 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.603383] env[60164]: DEBUG nova.compute.provider_tree [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 745.614592] env[60164]: DEBUG nova.scheduler.client.report [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 745.645709] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.646096] env[60164]: ERROR nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] 
[instance: 73b58043-e025-48ff-a22a-4d226c545456] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] Traceback (most recent call last): [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self.driver.spawn(context, instance, image_meta, [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self._vmops.spawn(context, instance, image_meta, injected_files, [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] vm_ref = self.build_virtual_machine(instance, [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] vif_infos = vmwarevif.get_vif_info(self._session, [ 745.646096] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] for vif in network_info: [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return self._sync_wrapper(fn, *args, **kwargs) [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self.wait() [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self[:] = self._gt.wait() [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return self._exit_event.wait() [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] result = hub.switch() [ 
745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return self.greenlet.switch() [ 745.646442] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] result = function(*args, **kwargs) [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] return func(*args, **kwargs) [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] raise e [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] nwinfo = self.network_api.allocate_for_instance( [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] created_port_ids = self._update_ports_for_instance( [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] with excutils.save_and_reraise_exception(): [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 745.646857] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] self.force_reraise() [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] raise self.value [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] updated_port = self._update_port( [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] _ensure_no_port_binding_failure(port) [ 
745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] raise exception.PortBindingFailed(port_id=port['id']) [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] nova.exception.PortBindingFailed: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. [ 745.647237] env[60164]: ERROR nova.compute.manager [instance: 73b58043-e025-48ff-a22a-4d226c545456] [ 745.647237] env[60164]: DEBUG nova.compute.utils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 745.652543] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.289s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.657066] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Build of instance 73b58043-e025-48ff-a22a-4d226c545456 was re-scheduled: Binding failed for port 287f3ea7-5f8f-462d-b0a4-2bb5a46b7eed, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 745.657561] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 745.657804] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 745.657945] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 745.658116] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 745.857246] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e532d7f7-aa4d-4ade-b151-00f2587c6613 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.866700] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c731a21-5ad0-4395-8a3e-ab927cb1629d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.898964] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be1b942-4bb0-45ec-b1b2-9823f820e8e1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.906769] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-114406bf-c1c2-408d-9b1c-91b1f59ceca8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.921757] env[60164]: DEBUG nova.compute.provider_tree [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 745.932209] env[60164]: DEBUG nova.scheduler.client.report [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 745.945360] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.293s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.946116] env[60164]: ERROR nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Traceback (most recent call last): [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self.driver.spawn(context, instance, image_meta, [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self._vmops.spawn(context, instance, image_meta, injected_files, [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] vm_ref = self.build_virtual_machine(instance, [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] vif_infos = vmwarevif.get_vif_info(self._session, [ 745.946116] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] for vif in network_info: [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return self._sync_wrapper(fn, *args, **kwargs) [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self.wait() [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File 
"/opt/stack/nova/nova/network/model.py", line 635, in wait [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self[:] = self._gt.wait() [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return self._exit_event.wait() [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] result = hub.switch() [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return self.greenlet.switch() [ 745.946486] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] result = function(*args, **kwargs) [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] return func(*args, **kwargs) [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] raise e [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] nwinfo = self.network_api.allocate_for_instance( [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] created_port_ids = self._update_ports_for_instance( [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] with excutils.save_and_reraise_exception(): [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 745.946844] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] self.force_reraise() [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] raise self.value [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] updated_port = self._update_port( [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] _ensure_no_port_binding_failure(port) [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] raise exception.PortBindingFailed(port_id=port['id']) [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] nova.exception.PortBindingFailed: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. [ 745.947284] env[60164]: ERROR nova.compute.manager [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] [ 745.947284] env[60164]: DEBUG nova.compute.utils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 745.948748] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Build of instance 031e33fa-92ab-483b-ab38-ecf3bbfd1374 was re-scheduled: Binding failed for port 521c92d4-0858-4e8f-a8a6-4774cc3624ae, please check neutron logs for more information. 
{{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 745.949177] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 745.949399] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquiring lock "refresh_cache-031e33fa-92ab-483b-ab38-ecf3bbfd1374" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 745.949540] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Acquired lock "refresh_cache-031e33fa-92ab-483b-ab38-ecf3bbfd1374" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 745.949754] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 745.959785] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 746.000967] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 746.193343] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.206454] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-031e33fa-92ab-483b-ab38-ecf3bbfd1374" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 746.206701] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 746.206874] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 746.207052] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 746.249072] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 746.257925] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.269932] env[60164]: INFO nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 031e33fa-92ab-483b-ab38-ecf3bbfd1374] Took 0.06 seconds to deallocate network for instance. 
[ 746.380309] env[60164]: INFO nova.scheduler.client.report [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Deleted allocations for instance 031e33fa-92ab-483b-ab38-ecf3bbfd1374 [ 746.401016] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "031e33fa-92ab-483b-ab38-ecf3bbfd1374" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.622s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.486714] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.500434] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Releasing lock "refresh_cache-73b58043-e025-48ff-a22a-4d226c545456" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 746.500710] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 746.501063] env[60164]: DEBUG nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 746.501300] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 746.555023] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 746.562318] env[60164]: DEBUG nova.network.neutron [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.573531] env[60164]: INFO nova.compute.manager [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] [instance: 73b58043-e025-48ff-a22a-4d226c545456] Took 0.07 seconds to deallocate network for instance. [ 746.680301] env[60164]: INFO nova.scheduler.client.report [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Deleted allocations for instance 73b58043-e025-48ff-a22a-4d226c545456 [ 746.698021] env[60164]: DEBUG oslo_concurrency.lockutils [None req-5f5fea44-936f-46e8-9776-aee45b3037a5 tempest-MultipleCreateTestJSON-1686950820 tempest-MultipleCreateTestJSON-1686950820-project-member] Lock "73b58043-e025-48ff-a22a-4d226c545456" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.946s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.748135] env[60164]: ERROR nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. 
[ 746.748135] env[60164]: ERROR nova.compute.manager Traceback (most recent call last): [ 746.748135] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 746.748135] env[60164]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 746.748135] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 746.748135] env[60164]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 746.748135] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 746.748135] env[60164]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 746.748135] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 746.748135] env[60164]: ERROR nova.compute.manager self.force_reraise() [ 746.748135] env[60164]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 746.748135] env[60164]: ERROR nova.compute.manager raise self.value [ 746.748135] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 746.748135] env[60164]: ERROR nova.compute.manager updated_port = self._update_port( [ 746.748135] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 746.748135] env[60164]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 746.748644] env[60164]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 746.748644] env[60164]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 746.748644] env[60164]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. 
[ 746.748644] env[60164]: ERROR nova.compute.manager [ 746.748644] env[60164]: Traceback (most recent call last): [ 746.748644] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 746.748644] env[60164]: listener.cb(fileno) [ 746.748644] env[60164]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 746.748644] env[60164]: result = function(*args, **kwargs) [ 746.748644] env[60164]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 746.748644] env[60164]: return func(*args, **kwargs) [ 746.748644] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 746.748644] env[60164]: raise e [ 746.748644] env[60164]: File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 746.748644] env[60164]: nwinfo = self.network_api.allocate_for_instance( [ 746.748644] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 746.748644] env[60164]: created_port_ids = self._update_ports_for_instance( [ 746.748644] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 746.748644] env[60164]: with excutils.save_and_reraise_exception(): [ 746.748644] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 746.748644] env[60164]: self.force_reraise() [ 746.748644] env[60164]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 746.748644] env[60164]: raise self.value [ 746.748644] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 746.748644] env[60164]: updated_port = self._update_port( [ 746.748644] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 746.748644] env[60164]: _ensure_no_port_binding_failure(port) [ 746.748644] env[60164]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 746.748644] env[60164]: raise exception.PortBindingFailed(port_id=port['id']) [ 746.749772] env[60164]: nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. [ 746.749772] env[60164]: Removing descriptor: 20 [ 746.749772] env[60164]: ERROR nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. 
[ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Traceback (most recent call last): [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] yield resources [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self.driver.spawn(context, instance, image_meta, [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 746.749772] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] vm_ref = self.build_virtual_machine(instance, [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] vif_infos = vmwarevif.get_vif_info(self._session, [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] for vif in network_info: [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return self._sync_wrapper(fn, *args, **kwargs) [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self.wait() [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self[:] = self._gt.wait() [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return self._exit_event.wait() [ 746.750176] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 746.750176] env[60164]: ERROR nova.compute.manager 
[instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] result = hub.switch() [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return self.greenlet.switch() [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] result = function(*args, **kwargs) [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return func(*args, **kwargs) [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] raise e [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] nwinfo = self.network_api.allocate_for_instance( [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] created_port_ids = self._update_ports_for_instance( [ 746.750577] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] with excutils.save_and_reraise_exception(): [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self.force_reraise() [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] raise self.value [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] updated_port = self._update_port( [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: 
ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] _ensure_no_port_binding_failure(port) [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] raise exception.PortBindingFailed(port_id=port['id']) [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. [ 746.750902] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] [ 746.751252] env[60164]: INFO nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Terminating instance [ 746.751252] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquiring lock "refresh_cache-ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 746.751252] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquired lock "refresh_cache-ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 746.751352] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 747.000346] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 747.421238] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 747.429685] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Releasing lock "refresh_cache-ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 747.436057] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 747.436057] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 747.436057] env[60164]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a33404c9-70e7-40e3-83e9-167834652089 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.447551] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a2ec946-6e88-4160-b784-b21c8d0aeff7 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.477604] env[60164]: WARNING nova.virt.vmwareapi.vmops [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad70ab2b-17e2-4cf1-9411-272aec5bfb8a could not be found. [ 747.477954] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 747.480018] env[60164]: INFO nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 747.480018] env[60164]: DEBUG oslo.service.loopingcall [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 747.480018] env[60164]: DEBUG nova.compute.manager [-] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 747.480018] env[60164]: DEBUG nova.network.neutron [-] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 747.543039] env[60164]: DEBUG nova.network.neutron [-] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 747.553274] env[60164]: DEBUG nova.network.neutron [-] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 747.567774] env[60164]: INFO nova.compute.manager [-] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Took 0.09 seconds to deallocate network for instance. [ 747.571341] env[60164]: DEBUG nova.compute.claims [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 747.571531] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 747.572238] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 747.709441] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd17de26-e747-4206-bc6a-3b59766f1df2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.716927] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32bff81d-3331-429d-9a9c-ad3d0fe18642 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.751516] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49e1a137-31bc-4926-878c-df3ee3acde9d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.761241] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1e29c3-e452-456d-8085-9aa4484f61f8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.774805] env[60164]: DEBUG nova.compute.provider_tree [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 
tempest-ServersNegativeTestJSON-589792489-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 747.784264] env[60164]: DEBUG nova.scheduler.client.report [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 747.796783] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.225s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 747.797406] env[60164]: ERROR nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. 
[ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Traceback (most recent call last): [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self.driver.spawn(context, instance, image_meta, [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] vm_ref = self.build_virtual_machine(instance, [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] vif_infos = vmwarevif.get_vif_info(self._session, [ 747.797406] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] for vif in network_info: [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return self._sync_wrapper(fn, *args, **kwargs) [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self.wait() [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self[:] = self._gt.wait() [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return self._exit_event.wait() [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] result = hub.switch() [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 747.797811] env[60164]: ERROR 
nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return self.greenlet.switch() [ 747.797811] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] result = function(*args, **kwargs) [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] return func(*args, **kwargs) [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 1987, in _allocate_network_async [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] raise e [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/compute/manager.py", line 1965, in _allocate_network_async [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] nwinfo = self.network_api.allocate_for_instance( [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 1216, in allocate_for_instance [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] created_port_ids = self._update_ports_for_instance( [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 1352, in _update_ports_for_instance [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] with excutils.save_and_reraise_exception(): [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 747.798241] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] self.force_reraise() [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] raise self.value [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 1327, in _update_ports_for_instance [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] updated_port = self._update_port( [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] _ensure_no_port_binding_failure(port) [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 747.798780] env[60164]: ERROR 
nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] raise exception.PortBindingFailed(port_id=port['id']) [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] nova.exception.PortBindingFailed: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. [ 747.798780] env[60164]: ERROR nova.compute.manager [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] [ 747.798780] env[60164]: DEBUG nova.compute.utils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 747.799967] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Build of instance ad70ab2b-17e2-4cf1-9411-272aec5bfb8a was re-scheduled: Binding failed for port 0b909c60-aba4-4d0b-8134-93a9bbbab5da, please check neutron logs for more information. {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 747.800407] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 747.800648] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquiring lock "refresh_cache-ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 747.800799] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Acquired lock "refresh_cache-ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 747.800967] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 748.023365] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 748.446169] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.465500] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Releasing lock "refresh_cache-ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 748.465747] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 748.465921] env[60164]: DEBUG nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Deallocating network for instance {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 748.466093] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] deallocate_for_instance() {{(pid=60164) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1782}} [ 748.517342] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 748.527534] env[60164]: DEBUG nova.network.neutron [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.543561] env[60164]: INFO nova.compute.manager [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] [instance: ad70ab2b-17e2-4cf1-9411-272aec5bfb8a] Took 0.08 seconds to deallocate network for instance. 
[ 748.655270] env[60164]: INFO nova.scheduler.client.report [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Deleted allocations for instance ad70ab2b-17e2-4cf1-9411-272aec5bfb8a [ 748.681756] env[60164]: DEBUG oslo_concurrency.lockutils [None req-04286880-fda9-4f78-826e-74ba0d7d7596 tempest-ServersNegativeTestJSON-589792489 tempest-ServersNegativeTestJSON-589792489-project-member] Lock "ad70ab2b-17e2-4cf1-9411-272aec5bfb8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.947s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.722563] env[60164]: WARNING oslo_vmware.rw_handles [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles response.begin() [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 775.722563] env[60164]: ERROR oslo_vmware.rw_handles [ 775.722563] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Downloaded image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 775.723370] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Caching image {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 775.723370] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Copying Virtual Disk [datastore1] vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk to [datastore1] 
vmware_temp/85f47989-aaca-47a7-8633-ddb9e0a8d4eb/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk {{(pid=60164) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 775.723458] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b0ce924d-6f79-400a-b7af-206a95116917 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 775.735047] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Waiting for the task: (returnval){ [ 775.735047] env[60164]: value = "task-1295460" [ 775.735047] env[60164]: _type = "Task" [ 775.735047] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 775.743463] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Task: {'id': task-1295460, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 776.247394] env[60164]: DEBUG oslo_vmware.exceptions [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Fault InvalidArgument not matched. {{(pid=60164) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 776.247823] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 776.248241] env[60164]: ERROR nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 776.248241] env[60164]: Faults: ['InvalidArgument'] [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] Traceback (most recent call last): [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] yield resources [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self.driver.spawn(context, instance, image_meta, [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 
68545276-63f2-4baf-8110-d3cc71686682] self._vmops.spawn(context, instance, image_meta, injected_files, [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self._fetch_image_if_missing(context, vi) [ 776.248241] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] image_cache(vi, tmp_image_ds_loc) [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] vm_util.copy_virtual_disk( [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] session._wait_for_task(vmdk_copy_task) [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] return self.wait_for_task(task_ref) [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] return evt.wait() [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] result = hub.switch() [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 776.248680] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] return self.greenlet.switch() [ 776.249071] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 776.249071] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self.f(*self.args, **self.kw) [ 776.249071] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 776.249071] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] raise exceptions.translate_fault(task_info.error) [ 776.249071] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 776.249071] 
env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] Faults: ['InvalidArgument'] [ 776.249071] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] [ 776.249071] env[60164]: INFO nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Terminating instance [ 776.250393] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 776.250593] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 776.251841] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "refresh_cache-68545276-63f2-4baf-8110-d3cc71686682" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 776.252033] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquired lock "refresh_cache-68545276-63f2-4baf-8110-d3cc71686682" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 776.252408] env[60164]: DEBUG nova.network.neutron [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 776.253422] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46889d15-f9d7-4931-81be-d51ab80f8a5a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.265697] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 776.266275] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60164) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 776.267592] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac389ae8-2907-4a13-9b4d-1f569ae4acb2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.274925] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 776.274925] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]528f474a-03e4-1db0-9fb2-708268443632" [ 776.274925] env[60164]: _type = "Task" [ 776.274925] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 776.283859] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]528f474a-03e4-1db0-9fb2-708268443632, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 776.286924] env[60164]: DEBUG nova.network.neutron [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 776.366137] env[60164]: DEBUG nova.network.neutron [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 776.378020] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Releasing lock "refresh_cache-68545276-63f2-4baf-8110-d3cc71686682" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 776.378020] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 776.378020] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 776.378020] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f7d24df-8945-4a38-a160-396f27402bd8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.388291] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Unregistering the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 776.388291] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-672d0da2-6c82-4227-98a9-3e9896633597 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.426175] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Unregistered the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 776.426175] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Deleting contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 776.426175] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Deleting the datastore file [datastore1] 68545276-63f2-4baf-8110-d3cc71686682 {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 776.426175] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-884a0121-ea5c-4172-8fac-dd5df7093512 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.437100] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Waiting for the task: (returnval){ [ 776.437100] env[60164]: value = "task-1295462" [ 776.437100] env[60164]: _type = "Task" [ 776.437100] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 776.444496] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Task: {'id': task-1295462, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 776.784461] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Preparing fetch location {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 776.784812] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating directory with path [datastore1] vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 776.785055] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-faea7258-5b2e-4667-aa9a-d12ca7dfb677 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.797925] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Created directory with path [datastore1] vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 776.798178] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Fetch image to [datastore1] vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 776.798359] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to [datastore1] vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 776.799139] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a7e7ed8-fcfb-42a0-8fd6-7cf0d60b9363 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.813710] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ea47d74-bf1c-43c0-a529-a7d3981d27a3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.828769] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab9cdec4-a655-4b4b-8411-299aec447d0f {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.865973] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0467c210-06b9-4905-8a80-84473aede2c6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
776.874011] env[60164]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5d268e6a-c856-4955-a71a-f434b2a8ae9d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 776.946357] env[60164]: DEBUG oslo_vmware.api [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Task: {'id': task-1295462, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.050316} completed successfully. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 776.946658] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Deleted the datastore file {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 776.947423] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Deleted contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 776.947423] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 776.947423] env[60164]: INFO nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Took 0.57 seconds to destroy the instance on the hypervisor. [ 776.947542] env[60164]: DEBUG oslo.service.loopingcall [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 776.947861] env[60164]: DEBUG nova.compute.manager [-] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 776.950494] env[60164]: DEBUG nova.compute.claims [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 776.950698] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.950977] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.973376] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 777.041454] env[60164]: DEBUG oslo_vmware.rw_handles [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 777.109370] env[60164]: DEBUG oslo_vmware.rw_handles [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Completed reading data from the image iterator. {{(pid=60164) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 777.109478] env[60164]: DEBUG oslo_vmware.rw_handles [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60164) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 777.122253] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f75e806-c38b-4c2e-b93b-65802556e77a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.128824] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe480d1-f768-41d9-bb23-c5cf9192afbd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.164988] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65684107-ff1b-48ff-b643-f73a932e2d44 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.171328] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f98116-7f06-4978-9c56-ccb00cbf79c2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.188329] env[60164]: DEBUG nova.compute.provider_tree [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 777.222037] env[60164]: ERROR nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [req-30705555-c269-48f4-a44c-d60336fbd7ad] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f. 
Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-30705555-c269-48f4-a44c-d60336fbd7ad"}]}: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 777.239592] env[60164]: DEBUG nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Refreshing inventories for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 777.255775] env[60164]: DEBUG nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updating ProviderTree inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 777.255980] env[60164]: DEBUG nova.compute.provider_tree [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 777.270519] env[60164]: DEBUG nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Refreshing aggregate associations for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f, aggregates: None {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 777.296170] env[60164]: DEBUG nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Refreshing trait associations for resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE {{(pid=60164) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 777.375362] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12b02c2d-caba-4ea5-9543-30f3526fdb45 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.385252] env[60164]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f90cc45-230c-43da-a2da-055620d8a577 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.422341] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71b3eda5-673c-4c57-ae1a-7ae98341b556 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.430830] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbbaf9ce-7479-4856-923c-eed1fdee0d1b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.447577] env[60164]: DEBUG nova.compute.provider_tree [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 777.485590] env[60164]: DEBUG nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updated inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with generation 47 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 777.485740] env[60164]: DEBUG nova.compute.provider_tree [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updating resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f generation from 47 to 48 during operation: update_inventory {{(pid=60164) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 777.485883] env[60164]: DEBUG nova.compute.provider_tree [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 138, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 777.504430] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe 
tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.553s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 777.505012] env[60164]: ERROR nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 777.505012] env[60164]: Faults: ['InvalidArgument'] [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] Traceback (most recent call last): [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self.driver.spawn(context, instance, image_meta, [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self._vmops.spawn(context, instance, image_meta, injected_files, [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self._fetch_image_if_missing(context, vi) [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] image_cache(vi, tmp_image_ds_loc) [ 777.505012] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] vm_util.copy_virtual_disk( [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] session._wait_for_task(vmdk_copy_task) [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] return self.wait_for_task(task_ref) [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] return evt.wait() [ 777.505404] env[60164]: ERROR 
nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] result = hub.switch() [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] return self.greenlet.switch() [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 777.505404] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] self.f(*self.args, **self.kw) [ 777.505765] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 777.505765] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] raise exceptions.translate_fault(task_info.error) [ 777.505765] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 777.505765] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] Faults: ['InvalidArgument'] [ 777.505765] env[60164]: ERROR nova.compute.manager [instance: 68545276-63f2-4baf-8110-d3cc71686682] [ 777.505765] env[60164]: DEBUG nova.compute.utils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] VimFaultException {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 777.507383] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Build of instance 68545276-63f2-4baf-8110-d3cc71686682 was re-scheduled: A specified parameter was not correct: fileType [ 777.507383] env[60164]: Faults: ['InvalidArgument'] {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 777.507745] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 777.508322] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquiring lock "refresh_cache-68545276-63f2-4baf-8110-d3cc71686682" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 777.508508] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Acquired lock 
"refresh_cache-68545276-63f2-4baf-8110-d3cc71686682" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 777.508720] env[60164]: DEBUG nova.network.neutron [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 777.547519] env[60164]: DEBUG nova.network.neutron [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 777.673531] env[60164]: DEBUG nova.network.neutron [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 777.692247] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Releasing lock "refresh_cache-68545276-63f2-4baf-8110-d3cc71686682" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 777.693494] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 777.693494] env[60164]: DEBUG nova.compute.manager [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] [instance: 68545276-63f2-4baf-8110-d3cc71686682] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 777.807428] env[60164]: INFO nova.scheduler.client.report [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Deleted allocations for instance 68545276-63f2-4baf-8110-d3cc71686682 [ 777.826379] env[60164]: DEBUG oslo_concurrency.lockutils [None req-95589ff0-2ae0-4df8-9925-f5d63a2665fe tempest-ServerDiagnosticsV248Test-1137552807 tempest-ServerDiagnosticsV248Test-1137552807-project-member] Lock "68545276-63f2-4baf-8110-d3cc71686682" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 97.417s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 785.890963] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 787.882231] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 787.886720] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 787.886895] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 787.898930] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 787.899198] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 787.899368] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 787.899524] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 787.900676] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50ba47f3-bc8a-4407-b5f6-72a4ce9cc961 {{(pid=60164) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.910371] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-410f2007-c4ff-498f-b8f0-3a09bcb2bbca {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.925230] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e2def8-fe63-45b1-9f83-de58639215d3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.932694] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f1cea50-5fd6-454e-a55f-d9d9d51a86a6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.963260] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181479MB free_disk=139GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 787.963461] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 787.963577] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.006690] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance b1361aa5-9bbd-4891-b74f-a0afd90b0bd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 788.006859] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 788.006985] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 946c73f8-1ed8-4180-a9d7-0b2970c4367e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 788.007183] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 788.007323] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 788.055527] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdf004a7-4761-4569-b603-80e3eae939c3 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.064739] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce1d6a8b-7dfe-440c-b1b2-eaa1a867e16a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.096774] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5084913-0144-423d-bc3e-b2205d0bc724 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.104899] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4c6fe08-5d34-4faa-b86b-0951945a55bf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.122728] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 788.155647] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updated inventory for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with generation 48 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 788.155882] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating resource provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f generation from 48 to 49 during operation: update_inventory {{(pid=60164) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 788.156044] env[60164]: DEBUG 
nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Updating inventory in ProviderTree for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 788.169198] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 788.169370] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 789.170175] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 789.170470] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 789.170566] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60164) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10408}} [ 789.882053] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 789.897205] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 789.897369] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Starting heal instance info cache {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9789}} [ 789.897477] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Rebuilding the list of instances to heal {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9793}} [ 789.910540] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Skipping network cache update for instance because it is Building. 
{{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 789.910689] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 789.910827] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 789.910983] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Didn't find any instances for network info cache update. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9875}} [ 789.911383] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 789.911552] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 823.419997] env[60164]: WARNING oslo_vmware.rw_handles [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles response.begin() [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 823.419997] env[60164]: ERROR oslo_vmware.rw_handles [ 823.420675] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Downloaded image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 
823.422181] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Caching image {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 823.422438] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Copying Virtual Disk [datastore1] vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk to [datastore1] vmware_temp/51d94923-3eb5-4be4-9598-c9cf3f3c1f24/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk {{(pid=60164) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 823.422717] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-618ee4f8-7208-4e99-bc96-1327f5aebdd9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.430266] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 823.430266] env[60164]: value = "task-1295467" [ 823.430266] env[60164]: _type = "Task" [ 823.430266] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 823.439557] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': task-1295467, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 823.940292] env[60164]: DEBUG oslo_vmware.exceptions [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Fault InvalidArgument not matched. 
{{(pid=60164) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 823.940614] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 823.941198] env[60164]: ERROR nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 823.941198] env[60164]: Faults: ['InvalidArgument'] [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Traceback (most recent call last): [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] yield resources [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self.driver.spawn(context, instance, image_meta, [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self._fetch_image_if_missing(context, vi) [ 823.941198] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] image_cache(vi, tmp_image_ds_loc) [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] vm_util.copy_virtual_disk( [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] session._wait_for_task(vmdk_copy_task) [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 823.941620] 
env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] return self.wait_for_task(task_ref) [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] return evt.wait() [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] result = hub.switch() [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 823.941620] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] return self.greenlet.switch() [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self.f(*self.args, **self.kw) [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] raise exceptions.translate_fault(task_info.error) [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Faults: ['InvalidArgument'] [ 823.942065] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] [ 823.942065] env[60164]: INFO nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Terminating instance [ 823.943362] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 823.944366] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 823.944852] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "refresh_cache-b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" {{(pid=60164) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 823.945070] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "refresh_cache-b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 823.945190] env[60164]: DEBUG nova.network.neutron [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 823.946142] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-01eb002e-feb3-472e-ac8f-e8c54cf41cc9 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.956289] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 823.956487] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60164) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 823.957449] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4357083-95c1-4bb1-bcc0-7c0630271372 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.962622] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Waiting for the task: (returnval){ [ 823.962622] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]521c84b9-ba70-fb51-a75d-eefe49963f85" [ 823.962622] env[60164]: _type = "Task" [ 823.962622] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 823.970446] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]521c84b9-ba70-fb51-a75d-eefe49963f85, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 823.971565] env[60164]: DEBUG nova.network.neutron [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 824.030707] env[60164]: DEBUG nova.network.neutron [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.039353] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "refresh_cache-b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 824.039815] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Start destroying the instance on the hypervisor. {{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 824.040013] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 824.041073] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-110882d6-3243-4928-ac25-954791e1abfa {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.049127] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Unregistering the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 824.049338] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f38b4ebc-954c-4403-a4e1-d6d86aeabd19 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.083025] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Unregistered the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 824.083250] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Deleting contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 824.083428] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Deleting the datastore file [datastore1] b1361aa5-9bbd-4891-b74f-a0afd90b0bd6 {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 824.083663] env[60164]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fb00906c-6929-4a7f-baab-fc8f948130f0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.089883] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for the task: (returnval){ [ 824.089883] env[60164]: value = "task-1295469" [ 824.089883] env[60164]: _type = "Task" [ 824.089883] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 824.097062] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': task-1295469, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 824.473385] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Preparing fetch location {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 824.473734] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Creating directory with path [datastore1] vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 824.473854] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f627f386-7e95-42b0-82e3-09cd9d8b3377 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.484999] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Created directory with path [datastore1] vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 824.485241] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Fetch image to [datastore1] vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 824.485452] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to [datastore1] vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 824.486165] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6379dc8e-4a5c-4404-b976-94015aa2af3a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.492631] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd093188-730c-4549-bc9e-1abeff04c1d6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.501379] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9728b099-f3aa-4333-a7df-b440b7d394f6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.532704] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1183180-227b-40ec-b25b-8b1cb3a86f94 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.538071] env[60164]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-61774ed0-3aa9-4641-ae06-28c1cbaad2c1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.556762] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 824.598637] env[60164]: DEBUG oslo_vmware.api [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Task: {'id': task-1295469, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.049121} completed successfully. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 824.599641] env[60164]: DEBUG oslo_vmware.rw_handles [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60164) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 824.601206] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Deleted the datastore file {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 824.601439] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Deleted contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 824.601614] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 824.601781] env[60164]: INFO nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Took 0.56 seconds to destroy the instance on the hypervisor. [ 824.602015] env[60164]: DEBUG oslo.service.loopingcall [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 824.650122] env[60164]: DEBUG nova.compute.manager [-] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Skipping network deallocation for instance since networking was not requested. {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 824.652773] env[60164]: DEBUG nova.compute.claims [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 824.652960] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 824.653208] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 824.657604] env[60164]: DEBUG oslo_vmware.rw_handles [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Completed reading data from the image iterator. 
{{(pid=60164) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 824.657791] env[60164]: DEBUG oslo_vmware.rw_handles [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 824.734732] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f860b2df-dcd8-4769-88b1-1e77cb918039 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.742264] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7930ea55-fbbe-4b8c-ad3c-fc85879ffb32 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.771864] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87dc2371-09ef-4478-bf29-fe2a922a103e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.778732] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6505ba6-6cd1-4f34-ae22-ee03ee9ba71b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 824.791405] env[60164]: DEBUG nova.compute.provider_tree [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 824.799929] env[60164]: DEBUG nova.scheduler.client.report [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 824.813474] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.160s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 824.814078] env[60164]: ERROR nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 824.814078] 
env[60164]: Faults: ['InvalidArgument'] [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Traceback (most recent call last): [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self.driver.spawn(context, instance, image_meta, [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self._fetch_image_if_missing(context, vi) [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] image_cache(vi, tmp_image_ds_loc) [ 824.814078] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] vm_util.copy_virtual_disk( [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] session._wait_for_task(vmdk_copy_task) [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] return self.wait_for_task(task_ref) [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] return evt.wait() [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] result = hub.switch() [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] return self.greenlet.switch() [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 824.814517] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] self.f(*self.args, **self.kw) [ 824.814917] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 824.814917] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] raise exceptions.translate_fault(task_info.error) [ 824.814917] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 824.814917] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Faults: ['InvalidArgument'] [ 824.814917] env[60164]: ERROR nova.compute.manager [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] [ 824.815297] env[60164]: DEBUG nova.compute.utils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] VimFaultException {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 824.816639] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Build of instance b1361aa5-9bbd-4891-b74f-a0afd90b0bd6 was re-scheduled: A specified parameter was not correct: fileType [ 824.816639] env[60164]: Faults: ['InvalidArgument'] {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 824.817097] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 824.817371] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquiring lock "refresh_cache-b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 824.817564] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Acquired lock "refresh_cache-b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 824.817772] env[60164]: DEBUG nova.network.neutron [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 824.840484] env[60164]: DEBUG nova.network.neutron [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 824.897976] env[60164]: DEBUG nova.network.neutron [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.906650] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Releasing lock "refresh_cache-b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 824.906873] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 824.907074] env[60164]: DEBUG nova.compute.manager [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] [instance: b1361aa5-9bbd-4891-b74f-a0afd90b0bd6] Skipping network deallocation for instance since networking was not requested. {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 824.987574] env[60164]: INFO nova.scheduler.client.report [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Deleted allocations for instance b1361aa5-9bbd-4891-b74f-a0afd90b0bd6 [ 825.003154] env[60164]: DEBUG oslo_concurrency.lockutils [None req-e12515a2-ed4a-472a-90a9-ba6e4491eab9 tempest-ServerShowV247Test-916977799 tempest-ServerShowV247Test-916977799-project-member] Lock "b1361aa5-9bbd-4891-b74f-a0afd90b0bd6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 144.449s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 845.887667] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 845.888119] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 845.888119] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Cleaning up deleted instances {{(pid=60164) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11076}} [ 845.901581] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] There are 0 instances to clean {{(pid=60164) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11085}} [ 845.901771] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations 
{{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 845.901928] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Cleaning up deleted instances with incomplete migration {{(pid=60164) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11114}} [ 845.912998] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 847.918262] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.883835] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.887435] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.887595] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Starting heal instance info cache {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9789}} [ 849.887719] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Rebuilding the list of instances to heal {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9793}} [ 849.899196] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 849.899352] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 849.899464] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Didn't find any instances for network info cache update. 
{{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9875}} [ 849.899900] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.900085] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.900218] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60164) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10408}} [ 849.900360] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.909112] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 849.909338] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 849.909508] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 849.909708] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 849.911241] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f516536-2748-4132-87c7-45a820439127 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.920103] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f902315e-eac2-40ca-b019-2e30e6b969db {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.933974] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d51e28c-e80c-499c-a102-182037e702d0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.940018] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bce7cc86-ecd6-41ed-b6a7-601597e91377 {{(pid=60164) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.970144] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181445MB free_disk=139GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 849.970282] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 849.970450] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.088914] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 850.089132] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 946c73f8-1ed8-4180-a9d7-0b2970c4367e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 850.089341] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 850.089483] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 850.124859] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efdf506a-5e84-4db7-8acf-7ebab56f61b8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.132317] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a97ae9f8-d7f3-4378-8ca4-a0136ff20670 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.160830] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92ed6d9d-8342-4eea-9c7c-76937bafb61d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.167186] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4304060-ac29-4ef4-a0e2-aafc72cbc284 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.179419] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 850.187138] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 850.202277] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 850.202440] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 851.190540] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running 
periodic task ComputeManager._poll_rebooting_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 851.190955] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.940487] env[60164]: WARNING oslo_vmware.rw_handles [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles response.begin() [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 871.940487] env[60164]: ERROR oslo_vmware.rw_handles [ 871.940487] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Downloaded image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 871.941362] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Caching image {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 871.941496] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Copying Virtual Disk [datastore1] vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk to [datastore1] vmware_temp/cf4227dd-95fd-44f2-a3c5-f05c509e6f6b/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk {{(pid=60164) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 871.941788] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-33f89474-ac78-4964-95bd-881f1c4f70ef {{(pid=60164) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.949607] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Waiting for the task: (returnval){ [ 871.949607] env[60164]: value = "task-1295470" [ 871.949607] env[60164]: _type = "Task" [ 871.949607] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 871.958490] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': task-1295470, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 872.460663] env[60164]: DEBUG oslo_vmware.exceptions [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Fault InvalidArgument not matched. {{(pid=60164) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 872.460985] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 872.461533] env[60164]: ERROR nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 872.461533] env[60164]: Faults: ['InvalidArgument'] [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Traceback (most recent call last): [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] yield resources [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self.driver.spawn(context, instance, image_meta, [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self._fetch_image_if_missing(context, vi) [ 872.461533] env[60164]: ERROR nova.compute.manager [instance: 
6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] image_cache(vi, tmp_image_ds_loc) [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] vm_util.copy_virtual_disk( [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] session._wait_for_task(vmdk_copy_task) [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] return self.wait_for_task(task_ref) [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] return evt.wait() [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] result = hub.switch() [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 872.462398] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] return self.greenlet.switch() [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self.f(*self.args, **self.kw) [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] raise exceptions.translate_fault(task_info.error) [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Faults: ['InvalidArgument'] [ 872.463210] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] [ 872.463210] env[60164]: INFO nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Terminating instance [ 
872.463578] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 872.463733] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 872.464080] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-39abec08-0c33-4f0a-ae21-89e4e141938d {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.466397] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "refresh_cache-6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 872.466540] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquired lock "refresh_cache-6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 872.466697] env[60164]: DEBUG nova.network.neutron [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 872.473143] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 872.473311] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60164) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 872.474479] env[60164]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9fe8aed2-8822-40de-b26a-777b31303159 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.481547] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Waiting for the task: (returnval){ [ 872.481547] env[60164]: value = "session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52989e98-cea2-3f4a-da27-76ff20186689" [ 872.481547] env[60164]: _type = "Task" [ 872.481547] env[60164]: } to complete. 
{{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 872.488746] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Task: {'id': session[528ca5dc-e009-fd53-4682-e6b571cb4de5]52989e98-cea2-3f4a-da27-76ff20186689, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 872.494303] env[60164]: DEBUG nova.network.neutron [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 872.551464] env[60164]: DEBUG nova.network.neutron [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 872.560104] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Releasing lock "refresh_cache-6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 872.560486] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 872.560681] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 872.561674] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3472115-6440-49fd-b97e-e615b0c7e8ba {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.569046] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Unregistering the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 872.569263] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5bace7c2-68f6-4e51-b8ce-2376400ba211 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.595140] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Unregistered the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 872.595354] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Deleting contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 872.595530] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Deleting the datastore file [datastore1] 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6 {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 872.595770] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-06f43cb3-2e4f-4779-a4f5-e68ee850501b {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.601356] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Waiting for the task: (returnval){ [ 872.601356] env[60164]: value = "task-1295472" [ 872.601356] env[60164]: _type = "Task" [ 872.601356] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 872.608793] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': task-1295472, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 872.991667] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Preparing fetch location {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 872.991935] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Creating directory with path [datastore1] vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 872.992163] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ce8dccda-dce0-4bfd-920b-5f0ef94157d1 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.003253] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Created directory with path [datastore1] vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4 {{(pid=60164) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 873.003429] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Fetch image to [datastore1] vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 873.003605] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to [datastore1] vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 873.004354] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfb192c4-0bc3-4a50-ac4e-ad65ad7f41b2 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.010920] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e3e8b0e-7946-4926-9d80-d5c22c4beaac {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.019571] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98c8c42b-dcb5-4339-8865-77f43e083e71 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.050620] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63adcf3d-6673-4348-bf4a-19f5331363c4 {{(pid=60164) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.055651] env[60164]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-307410de-8698-4b01-bcf5-0080fb966ba5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.109552] env[60164]: DEBUG oslo_vmware.api [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Task: {'id': task-1295472, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042189} completed successfully. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 873.109817] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Deleted the datastore file {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 873.110018] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Deleted contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 873.110196] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 873.110364] env[60164]: INFO nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Took 0.55 seconds to destroy the instance on the hypervisor. [ 873.110586] env[60164]: DEBUG oslo.service.loopingcall [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 873.110780] env[60164]: DEBUG nova.compute.manager [-] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 873.112835] env[60164]: DEBUG nova.compute.claims [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 873.113026] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.113236] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.140379] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Downloading image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 873.185607] env[60164]: DEBUG oslo_vmware.rw_handles [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60164) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 873.187813] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54bf68f5-e6fc-45ad-9d0f-44e727b9e7e6 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.243628] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-160f6e9c-63a2-4180-b9de-e082b041abee {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.247180] env[60164]: DEBUG oslo_vmware.rw_handles [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Completed reading data from the image iterator. {{(pid=60164) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 873.247343] env[60164]: DEBUG oslo_vmware.rw_handles [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60164) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 873.274072] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d090db57-1e7a-454c-a3f9-97292cc349f8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.281393] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d2a5128-31f5-4e50-b3a9-be80dd191e91 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.294379] env[60164]: DEBUG nova.compute.provider_tree [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 873.302530] env[60164]: DEBUG nova.scheduler.client.report [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 873.314977] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.202s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.315498] env[60164]: ERROR nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 873.315498] env[60164]: Faults: ['InvalidArgument'] [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Traceback (most recent call last): [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self.driver.spawn(context, instance, image_meta, [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in 
spawn [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self._fetch_image_if_missing(context, vi) [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] image_cache(vi, tmp_image_ds_loc) [ 873.315498] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] vm_util.copy_virtual_disk( [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] session._wait_for_task(vmdk_copy_task) [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] return self.wait_for_task(task_ref) [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] return evt.wait() [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] result = hub.switch() [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] return self.greenlet.switch() [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 873.315974] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] self.f(*self.args, **self.kw) [ 873.316453] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 873.316453] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] raise exceptions.translate_fault(task_info.error) [ 873.316453] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 873.316453] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Faults: ['InvalidArgument'] [ 873.316453] env[60164]: ERROR nova.compute.manager [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] [ 873.316453] env[60164]: DEBUG nova.compute.utils 
[None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] VimFaultException {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 873.317470] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Build of instance 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6 was re-scheduled: A specified parameter was not correct: fileType [ 873.317470] env[60164]: Faults: ['InvalidArgument'] {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 873.317861] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 873.318096] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquiring lock "refresh_cache-6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 873.318293] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Acquired lock "refresh_cache-6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 873.318387] env[60164]: DEBUG nova.network.neutron [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 873.340580] env[60164]: DEBUG nova.network.neutron [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Instance cache missing network info. 
{{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 873.394144] env[60164]: DEBUG nova.network.neutron [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 873.402110] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Releasing lock "refresh_cache-6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 873.402316] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 873.402488] env[60164]: DEBUG nova.compute.manager [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] [instance: 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6] Skipping network deallocation for instance since networking was not requested. {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 873.482134] env[60164]: INFO nova.scheduler.client.report [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Deleted allocations for instance 6c8194c3-68fd-4ffc-a0fa-f23c8935bee6 [ 873.497641] env[60164]: DEBUG oslo_concurrency.lockutils [None req-d96c0ced-f062-463d-ae0d-4400b83d8169 tempest-ServerShowV254Test-893530536 tempest-ServerShowV254Test-893530536-project-member] Lock "6c8194c3-68fd-4ffc-a0fa-f23c8935bee6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 188.272s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.888625] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 907.889079] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 909.884041] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 909.884533] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 909.895269] env[60164]: DEBUG 
oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 909.895436] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60164) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10408}} [ 910.887819] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 910.888307] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 910.897730] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.897926] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.898106] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.898262] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60164) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 910.899390] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-921090e6-dd4d-4e5d-971d-a24248db0d71 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.908133] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8fc1682-99b1-4d7f-8c2a-010e6514f0b5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.921644] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41bf6f17-e395-41dd-a574-72ade8b2aeae {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.928261] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca04172d-d97b-47cc-a2d8-bd86f17a6ddf {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.958225] env[60164]: DEBUG nova.compute.resource_tracker [None 
req-ed156dec-397c-455b-9740-451d876eb328 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181497MB free_disk=139GB free_vcpus=48 pci_devices=None {{(pid=60164) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 910.958372] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.958545] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.995441] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Instance 946c73f8-1ed8-4180-a9d7-0b2970c4367e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60164) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 910.995669] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 910.995815] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60164) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 911.021830] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc9f6133-daa2-4e8a-933b-e92224949791 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.029448] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4eabce7-1160-404d-a118-e816729c54dd {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.058310] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf81ec2-297b-4d01-b226-5006eae1f3b8 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.065145] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-008e6c01-fd06-49ec-8d59-5972999f839a {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.077611] env[60164]: DEBUG nova.compute.provider_tree [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 911.085590] env[60164]: DEBUG nova.scheduler.client.report [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Inventory has not changed for provider 
ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 911.097441] env[60164]: DEBUG nova.compute.resource_tracker [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60164) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 911.097604] env[60164]: DEBUG oslo_concurrency.lockutils [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 912.098405] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 912.098799] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Starting heal instance info cache {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9789}} [ 912.098799] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Rebuilding the list of instances to heal {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9793}} [ 912.108704] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Skipping network cache update for instance because it is Building. {{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9802}} [ 912.108860] env[60164]: DEBUG nova.compute.manager [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Didn't find any instances for network info cache update. 
{{(pid=60164) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9875}} [ 912.109049] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 912.887754] env[60164]: DEBUG oslo_service.periodic_task [None req-ed156dec-397c-455b-9740-451d876eb328 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60164) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 918.603357] env[60164]: WARNING oslo_vmware.rw_handles [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles response.begin() [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 918.603357] env[60164]: ERROR oslo_vmware.rw_handles [ 918.603974] env[60164]: DEBUG nova.virt.vmwareapi.images [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Downloaded image file data 1618eb55-f00d-42a5-b978-e81e57855fb4 to vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk on the data store datastore1 {{(pid=60164) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 918.605551] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Caching image {{(pid=60164) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 918.605814] env[60164]: DEBUG nova.virt.vmwareapi.vm_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Copying Virtual Disk [datastore1] vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/tmp-sparse.vmdk to [datastore1] vmware_temp/d6150208-0515-45c5-aad1-ed5ddd31e308/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk {{(pid=60164) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 918.606184] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a417f12b-4525-4140-8597-9f1fa14c3a5e {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 918.615182] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Waiting for the task: (returnval){ [ 918.615182] env[60164]: value = "task-1295473" [ 918.615182] env[60164]: _type = "Task" [ 918.615182] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 918.622547] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Task: {'id': task-1295473, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 919.125286] env[60164]: DEBUG oslo_vmware.exceptions [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Fault InvalidArgument not matched. {{(pid=60164) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 919.125514] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1618eb55-f00d-42a5-b978-e81e57855fb4/1618eb55-f00d-42a5-b978-e81e57855fb4.vmdk" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 919.126063] env[60164]: ERROR nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 919.126063] env[60164]: Faults: ['InvalidArgument'] [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Traceback (most recent call last): [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/compute/manager.py", line 2849, in _build_resources [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] yield resources [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self.driver.spawn(context, instance, image_meta, [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self._fetch_image_if_missing(context, vi) [ 919.126063] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] image_cache(vi, tmp_image_ds_loc) [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] vm_util.copy_virtual_disk( [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] session._wait_for_task(vmdk_copy_task) [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] return self.wait_for_task(task_ref) [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] return evt.wait() [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] result = hub.switch() [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 919.126639] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] return self.greenlet.switch() [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self.f(*self.args, **self.kw) [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] raise exceptions.translate_fault(task_info.error) [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Faults: ['InvalidArgument'] [ 919.127099] env[60164]: ERROR nova.compute.manager [instance: 
946c73f8-1ed8-4180-a9d7-0b2970c4367e] [ 919.127099] env[60164]: INFO nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Terminating instance [ 919.129119] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "refresh_cache-946c73f8-1ed8-4180-a9d7-0b2970c4367e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 919.129273] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquired lock "refresh_cache-946c73f8-1ed8-4180-a9d7-0b2970c4367e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 919.129434] env[60164]: DEBUG nova.network.neutron [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 919.236139] env[60164]: DEBUG nova.network.neutron [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 919.294038] env[60164]: DEBUG nova.network.neutron [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 919.302920] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Releasing lock "refresh_cache-946c73f8-1ed8-4180-a9d7-0b2970c4367e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 919.303322] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Start destroying the instance on the hypervisor. 
{{(pid=60164) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} [ 919.303504] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Destroying instance {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 919.304501] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7390c290-a33f-424c-9701-6297d8441bf5 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.312087] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Unregistering the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 919.312287] env[60164]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-97d86956-11af-4654-adf8-f83765d4a8c4 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.338934] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Unregistered the VM {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 919.339148] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Deleting contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 919.339331] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Deleting the datastore file [datastore1] 946c73f8-1ed8-4180-a9d7-0b2970c4367e {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 919.339588] env[60164]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-42eda9ee-0aa9-4a1d-977c-b278091c25a0 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.345350] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Waiting for the task: (returnval){ [ 919.345350] env[60164]: value = "task-1295475" [ 919.345350] env[60164]: _type = "Task" [ 919.345350] env[60164]: } to complete. {{(pid=60164) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 919.352792] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Task: {'id': task-1295475, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 919.855310] env[60164]: DEBUG oslo_vmware.api [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Task: {'id': task-1295475, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034864} completed successfully. {{(pid=60164) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 919.855716] env[60164]: DEBUG nova.virt.vmwareapi.ds_util [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Deleted the datastore file {{(pid=60164) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 919.855765] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Deleted contents of the VM from datastore datastore1 {{(pid=60164) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 919.855997] env[60164]: DEBUG nova.virt.vmwareapi.vmops [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Instance destroyed {{(pid=60164) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 919.856197] env[60164]: INFO nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Took 0.55 seconds to destroy the instance on the hypervisor. [ 919.856423] env[60164]: DEBUG oslo.service.loopingcall [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60164) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 919.856616] env[60164]: DEBUG nova.compute.manager [-] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Skipping network deallocation for instance since networking was not requested.
{{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 919.858694] env[60164]: DEBUG nova.compute.claims [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Aborting claim: {{(pid=60164) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 919.858911] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 919.859153] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 919.916493] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-228e7a26-41bf-488e-a2a9-0d6365f917bb {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.924070] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2132b7e7-a445-4cb3-990f-bb47805b3164 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.953889] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-734f5ead-2bca-42c9-92ba-76a1c7027a97 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.961197] env[60164]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db47656f-8c4f-4380-8436-074e64e56849 {{(pid=60164) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.976223] env[60164]: DEBUG nova.compute.provider_tree [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Inventory has not changed in ProviderTree for provider: ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f {{(pid=60164) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 919.984394] env[60164]: DEBUG nova.scheduler.client.report [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Inventory has not changed for provider ab7eee2c-d48f-4d45-b0c7-727ddc03ea4f based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 139, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60164) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 919.996833] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 
tempest-ServersAaction247Test-439800747-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.138s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 919.997378] env[60164]: ERROR nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 919.997378] env[60164]: Faults: ['InvalidArgument'] [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Traceback (most recent call last): [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/compute/manager.py", line 2607, in _build_and_run_instance [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self.driver.spawn(context, instance, image_meta, [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self._fetch_image_if_missing(context, vi) [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] image_cache(vi, tmp_image_ds_loc) [ 919.997378] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] vm_util.copy_virtual_disk( [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] session._wait_for_task(vmdk_copy_task) [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] return self.wait_for_task(task_ref) [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] return evt.wait() [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] result = hub.switch() [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] return self.greenlet.switch() [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 919.997762] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] self.f(*self.args, **self.kw) [ 919.998162] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 919.998162] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] raise exceptions.translate_fault(task_info.error) [ 919.998162] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 919.998162] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Faults: ['InvalidArgument'] [ 919.998162] env[60164]: ERROR nova.compute.manager [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] [ 919.998162] env[60164]: DEBUG nova.compute.utils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] VimFaultException {{(pid=60164) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 919.999534] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Build of instance 946c73f8-1ed8-4180-a9d7-0b2970c4367e was re-scheduled: A specified parameter was not correct: fileType [ 919.999534] env[60164]: Faults: ['InvalidArgument'] {{(pid=60164) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 919.999934] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Unplugging VIFs for instance {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} [ 920.000170] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquiring lock "refresh_cache-946c73f8-1ed8-4180-a9d7-0b2970c4367e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 920.000313] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Acquired lock "refresh_cache-946c73f8-1ed8-4180-a9d7-0b2970c4367e" {{(pid=60164) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 920.000466] env[60164]: DEBUG nova.network.neutron [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Building network info cache for instance {{(pid=60164) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1989}} [ 920.023017] env[60164]: DEBUG nova.network.neutron [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Instance cache missing network info. {{(pid=60164) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3302}} [ 920.077988] env[60164]: DEBUG nova.network.neutron [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Updating instance_info_cache with network_info: [] {{(pid=60164) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 920.085747] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Releasing lock "refresh_cache-946c73f8-1ed8-4180-a9d7-0b2970c4367e" {{(pid=60164) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 920.085951] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60164) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2984}} [ 920.086147] env[60164]: DEBUG nova.compute.manager [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] [instance: 946c73f8-1ed8-4180-a9d7-0b2970c4367e] Skipping network deallocation for instance since networking was not requested. {{(pid=60164) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 920.166621] env[60164]: INFO nova.scheduler.client.report [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Deleted allocations for instance 946c73f8-1ed8-4180-a9d7-0b2970c4367e [ 920.186864] env[60164]: DEBUG oslo_concurrency.lockutils [None req-59a315b4-ab5c-4345-bdbe-05be08a4aa67 tempest-ServersAaction247Test-439800747 tempest-ServersAaction247Test-439800747-project-member] Lock "946c73f8-1ed8-4180-a9d7-0b2970c4367e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 188.695s {{(pid=60164) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}